Abstract:
Individuals with dysphagia may be prescribed thickened fluids to promote a safer and more successful swallow. Starch-based thickening agents are often employed; however, these exhibit great variation in consistency. The aim of this study was to compare viscosity and the rheological parameters complex modulus (G*), viscous modulus (G″), and elastic modulus (G′) over a range of physiological shear rates for UK commercially available dysphagia products at “custard” consistency. Commercially available starch-based dysphagia products were prepared according to manufacturers’ instructions, and the viscosity and rheological parameters were tested on a CVOR rheometer. At a measured shear rate of 50 s−1, all products fell within the viscosity limits defined by the National Dysphagia Diet Task Force guidelines. However, at lower shear rates, large variations in viscosity were observed. The rheological parameters G*, G′, and G″ also demonstrated considerable differences in both overall strength and rheological behavior between different batches of the same product and between different product types. The large range in consistency and the changes in the overall structure of the starch-based products over a range of physiological shear rates show that patients could be receiving fluids with very different characteristics from those advised. This could have detrimental effects on their ability to swallow.
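A minimal sketch of the 50 s⁻¹ viscosity bands referred to above, assuming the commonly cited National Dysphagia Diet Task Force ranges (the exact boundary values are an assumption here, not stated in this abstract):

```python
# Classify a viscosity measured at a shear rate of 50 s^-1 into NDD consistency
# categories. Boundary values (in mPa·s) are the commonly cited NDDTF ranges
# and should be checked against current guidelines before any clinical use.
def ndd_category(viscosity_mpas):
    """Return the NDD consistency label for a viscosity at 50 s^-1."""
    if viscosity_mpas < 1:
        raise ValueError("viscosity must be at least 1 mPa·s")
    if viscosity_mpas <= 50:
        return "thin"
    if viscosity_mpas <= 350:
        return "nectar-like"
    if viscosity_mpas <= 1750:
        return "honey-like"
    return "spoon-thick"
```

For example, a product measuring 600 mPa·s at 50 s⁻¹ would fall in the honey-like band.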
Abstract:
Resource monitoring in distributed systems is required to understand the 'health' of the overall system and to help identify particular problems, such as dysfunctional hardware or faulty system or application software. Monitoring systems such as GridRM provide the ability to connect to any number of different types of monitoring agents and provide different views of the system, based on a client's particular preferences. Web 2.0 technologies, and in particular 'mashups', are emerging as a promising technique for rapidly constructing rich user interfaces that combine and present data in intuitive ways. This paper describes a Web 2.0 user interface that was created to expose resource data harvested by the GridRM resource monitoring system.
Abstract:
The usefulness of motor subtypes of delirium is unclear due to inconsistency in subtyping methods and a lack of validation with objective measures of activity. The activity of 40 patients was measured over 24 h with a commercial accelerometer-based activity monitor. Accelerometry data from patients with DSM-IV delirium that were readily divided into hyperactive, hypoactive and mixed motor subtypes were used to create classification trees that were subsequently applied to the remaining cohort to define motoric subtypes. The classification trees used the periods of sitting/lying, standing and stepping and the number of postural transitions measured by the activity monitor as the determining factors from which to classify the delirious cohort. The use of a classification system shows how delirium subtypes can be categorised in relation to overall activity and postural changes, which was one of the most discriminating measures examined. The classification system was also implemented to successfully define other patient motoric subtypes. Motor subtypes of delirium defined by observed ward behaviour differ in electronically measured activity levels. Crown Copyright (C) 2009 Published by Elsevier B.V. All rights reserved.
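The classification-tree idea described above can be sketched as a simple rule-based classifier; the threshold values below are hypothetical illustrations, not those derived in the study:

```python
# Assign a delirium motor subtype from 24 h accelerometer summaries. The
# fraction of time sitting/lying and the number of postural transitions are
# the kinds of features the study used; the cut-off values here are invented
# purely for illustration.
def motor_subtype(frac_sitting_lying, n_postural_transitions):
    """Classify as hypoactive, hyperactive or mixed (hypothetical thresholds)."""
    if frac_sitting_lying > 0.9 and n_postural_transitions < 10:
        return "hypoactive"
    if frac_sitting_lying < 0.6 and n_postural_transitions > 40:
        return "hyperactive"
    return "mixed"
```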
Abstract:
The overall operation and internal complexity of particular production machinery can be depicted in terms of clusters of multidimensional points that describe the process states, the value in each point dimension representing a measured variable from the machinery. The paper describes a new cluster analysis technique for use with manufacturing processes, to illustrate how machine behaviour can be categorised and how regions of good and poor machine behaviour can be identified. The cluster algorithm presented is the novel mean-tracking algorithm, capable of locating N-dimensional clusters in a large data space in which a considerable amount of noise is present. Implementation of the algorithm on a real-world high-speed machinery application is described, with clusters being formed from machinery data to indicate machinery error regions and error-free regions. This analysis is seen to provide a promising step forward in the field of multivariable control of manufacturing systems.
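The abstract does not specify the mean-tracking algorithm itself; as a rough illustration of the idea of tracking a local mean through a noisy N-dimensional point cloud, a generic mean-shift-style update might look like:

```python
# Move a candidate cluster centre to the mean of the points within `radius`,
# repeating until it stops moving. This is a generic mean-shift-style sketch,
# not the paper's own mean-tracking algorithm.
def track_mean(points, start, radius, max_iter=100):
    """Iteratively track the local mean of `points` near `start`."""
    centre = list(start)
    for _ in range(max_iter):
        near = [p for p in points
                if sum((a - b) ** 2 for a, b in zip(p, centre)) <= radius ** 2]
        if not near:
            return centre  # no points in range; stop at current position
        new = [sum(coord) / len(near) for coord in zip(*near)]
        if all(abs(a - b) < 1e-9 for a, b in zip(new, centre)):
            return new
        centre = new
    return centre
```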
Abstract:
Bayesian, maximum-likelihood, and maximum-parsimony phylogenies, constructed using nucleotide sequences from the plastid gene region trnK-matK, are employed to investigate relationships within the Cactaceae. These phylogenies sample 666 plants representing 532 of the 1438 species recognized in the family. All four subfamilies, all nine tribes, and 69% of currently recognized genera of Cactaceae are sampled. We found strong support for three of the four currently recognized subfamilies, although relationships between subfamilies were not well defined. Major clades recovered within the largest subfamilies, Opuntioideae and Cactoideae, are reviewed; only three of the nine currently accepted tribes delimited within these subfamilies, the Cacteae, Rhipsalideae, and Opuntieae, are monophyletic, although the Opuntieae were recovered in only the Bayesian and maximum-likelihood analyses, not in the maximum-parsimony analysis, and more data are needed to reveal the status of the Cylindropuntieae, which may yet be monophyletic. Of the 42 genera with more than one exemplar in our study, only 17 were monophyletic; 14 of these genera were from subfamily Cactoideae and three from subfamily Opuntioideae. We present a synopsis of the status of the currently recognized genera.
Abstract:
This article introduces generalized beta-generated (GBG) distributions. Sub-models include all classical beta-generated, Kumaraswamy-generated and exponentiated distributions. They are maximum entropy distributions under three intuitive conditions, which show that the classical beta generator skewness parameters only control tail entropy and an additional shape parameter is needed to add entropy to the centre of the parent distribution. This parameter controls skewness without necessarily differentiating tail weights. The GBG class also has tractable properties: we present various expansions for moments, generating function and quantiles. The model parameters are estimated by maximum likelihood and the usefulness of the new class is illustrated by means of some real data sets.
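The beta-generated construction underlying the GBG class can be sketched directly: if B ~ Beta(a, b) and G is a parent CDF, then X = G⁻¹(B) follows the beta-G distribution. A minimal sampler, assuming an exponential parent purely for illustration:

```python
import math
import random

# Draw from a beta-exponential distribution via the beta-generated construction
# X = G^{-1}(B), with B ~ Beta(a, b) and G the exponential(rate) CDF.
def beta_exponential_sample(a, b, rate, rng=random):
    """One variate from the beta-G distribution with exponential parent."""
    u = rng.betavariate(a, b)          # B ~ Beta(a, b)
    return -math.log(1.0 - u) / rate   # inverse exponential CDF
```

With a = b = 1 the beta generator is uniform and the parent exponential distribution is recovered.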
Abstract:
In nonhuman species, testosterone is known to have permanent organizing effects early in life that predict later expression of sex differences in brain and behavior. However, in humans, it is still unknown whether such mechanisms have organizing effects on neural sexual dimorphism. In human males, we show that variation in fetal testosterone (FT) predicts later local gray matter volume of specific brain regions in a direction that is congruent with sexual dimorphism observed in a large independent sample of age-matched males and females from the NIH Pediatric MRI Data Repository. Right temporoparietal junction/posterior superior temporal sulcus (RTPJ/pSTS), planum temporale/parietal operculum (PT/PO), and posterior lateral orbitofrontal cortex (plOFC) had local gray matter volume that was both sexually dimorphic and predicted in a congruent direction by FT. That is, gray matter volume in RTPJ/pSTS was greater for males compared to females and was positively predicted by FT. Conversely, gray matter volume in PT/PO and plOFC was greater in females compared to males and was negatively predicted by FT. Subregions of both amygdala and hypothalamus were also sexually dimorphic in the direction of Male > Female, but were not predicted by FT. However, FT positively predicted gray matter volume of a non-sexually dimorphic subregion of the amygdala. These results bridge a long-standing gap between human and nonhuman species by showing that FT acts as an organizing mechanism for the development of regional sexual dimorphism in the human brain.
Abstract:
The rapid growth of non-listed real estate funds over the last several years has contributed towards establishing this sector as a major investment vehicle for gaining exposure to commercial real estate. Academic research has not kept up with this development, however, as there are still only a few published studies on non-listed real estate funds. This paper aims to identify the factors driving total return over a seven-year period. Influential factors tested in our analysis include the weighted underlying direct property returns in each country and sector, as well as fund size, investment style, gearing and the distribution yield. Furthermore, we analyze the interaction of non-listed real estate funds with the performance of the overall economy and that of competing asset classes, and find that lagged GDP growth and stock market returns, as well as contemporaneous government bond rates, are significant and positive predictors of annual fund performance.
Abstract:
Background and Objectives: Low self-esteem (LSE) is associated with psychiatric disorder, and is distressing and debilitating in its own right. Hence, it is a frequent target for treatment in cognitive behavioural interventions, yet it has rarely been the primary focus for intervention. This paper reports on a preliminary randomized controlled trial of cognitive behaviour therapy (CBT) for LSE using Fennell’s cognitive conceptualisation and transdiagnostic treatment approach (Fennell, 1997, 1999). Methods: Twenty-two participants were randomly allocated to either immediate treatment (IT) (n = 11) or to a waitlist condition (WL) (n = 11). Treatment consisted of 10 sessions of individual CBT accompanied by workbooks. Participants allocated to the WL condition received the CBT intervention once the waitlist period was completed, and all participants were followed up 11 weeks after completing CBT. Results: The IT group showed significantly better functioning than the WL group on measures of LSE, overall functioning and depression, and had fewer psychiatric diagnoses at the end of treatment. The WL group showed the same pattern of response to CBT as the group who had received CBT immediately. All treatment gains were maintained at follow-up assessment. Limitations: The sample size is small and consists mainly of women with a high level of educational attainment, and the follow-up period was relatively short. Conclusions: These preliminary findings suggest that a focused, brief CBT intervention can be effective in treating LSE and associated symptoms and diagnoses in a clinically representative group of individuals with a range of different and co-morbid disorders.
Abstract:
This paper presents an assessment of the impacts of climate change on a series of indicators of hydrological regimes across the global domain, using a global hydrological model run with climate scenarios constructed using pattern-scaling from 21 CMIP3 (Coupled Model Intercomparison Project Phase 3) climate models. Changes are compared with natural variability, with a significant change being defined as greater than the standard deviation of the hydrological indicator in the absence of climate change. Under an SRES (Special Report on Emissions Scenarios) A1b emissions scenario, substantial proportions of the land surface (excluding Greenland and Antarctica) would experience significant changes in hydrological behaviour by 2050; under one climate model scenario (Hadley Centre HadCM3), average annual runoff increases significantly over 47% of the land surface and decreases over 36%; only 17% therefore sees no significant change. There is considerable variability between regions, depending largely on projected changes in precipitation. Uncertainty in projected river flow regimes is dominated by variation in the spatial patterns of climate change between climate models (hydrological model uncertainty is not included). There is, however, a strong degree of consistency in the overall magnitude and direction of change. More than two-thirds of climate models project a significant increase in average annual runoff across almost a quarter of the land surface, and a significant decrease over 14%, with considerably higher degrees of consistency in some regions. Most climate models project increases in runoff in Canada and high-latitude eastern Europe and Siberia, and decreases in runoff in central Europe, around the Mediterranean, the Mashriq, central America and Brazil. There is some evidence that projected change in runoff at the regional scale is not linear with change in global average temperature. The effect of uncertainty in the rate of future emissions is relatively small.
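The significance criterion used above (a change larger than the standard deviation of the indicator under natural variability) can be sketched as:

```python
import math

# A projected change in a hydrological indicator counts as significant when
# the shift in its mean exceeds one standard deviation of the indicator in
# the control (no-climate-change) run.
def significant_change(control_runs, future_runs):
    """True if the change in mean exceeds one control-run standard deviation."""
    n = len(control_runs)
    mean_c = sum(control_runs) / n
    std_c = math.sqrt(sum((x - mean_c) ** 2 for x in control_runs) / n)
    mean_f = sum(future_runs) / len(future_runs)
    return abs(mean_f - mean_c) > std_c
```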
Abstract:
Many applications, such as intermittent data assimilation, lead to a recursive application of Bayesian inference within a Monte Carlo context. Popular data assimilation algorithms include sequential Monte Carlo methods and ensemble Kalman filters (EnKFs). These methods differ in the way Bayesian inference is implemented. Sequential Monte Carlo methods rely on importance sampling combined with a resampling step, while EnKFs utilize a linear transformation of Monte Carlo samples based on the classic Kalman filter. While EnKFs have proven to be quite robust even for small ensemble sizes, they are not consistent since their derivation relies on a linear regression ansatz. In this paper, we propose another transform method, which does not rely on any a priori assumptions on the underlying prior and posterior distributions. The new method is based on solving an optimal transportation problem for discrete random variables. © 2013, Society for Industrial and Applied Mathematics
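The importance-sampling-plus-resampling step that distinguishes sequential Monte Carlo from the EnKF transform can be sketched as follows (a 1-D baseline with an assumed Gaussian observation likelihood; the paper's contribution replaces the random resampling with a deterministic optimal-transport transformation):

```python
import math
import random

# One sequential Monte Carlo assimilation step: weight each ensemble member by
# a Gaussian observation likelihood, then resample multinomially. This is the
# baseline that the optimal-transport transform method improves on.
def sir_step(ensemble, obs, obs_std, rng=random):
    """Importance weighting + multinomial resampling of a 1-D ensemble."""
    weights = [math.exp(-0.5 * ((x - obs) / obs_std) ** 2) for x in ensemble]
    total = sum(weights)
    weights = [w / total for w in weights]
    return rng.choices(ensemble, weights=weights, k=len(ensemble))
```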
Abstract:
We investigated selective impairments in the production of regular and irregular past tense by examining language performance and lesion sites in a sample of twelve stroke patients. A disadvantage in regular past tense production was observed in six patients when phonological complexity was greater for regular than irregular verbs, and in three patients when phonological complexity was closely matched across regularity. These deficits were not consistently related to grammatical difficulties or phonological errors but were consistently related to lesion site. All six patients with a regular past tense disadvantage had damage to the left ventral pars opercularis (in the inferior frontal cortex), an area associated with articulatory sequencing in prior functional imaging studies. In addition, those that maintained a disadvantage for regular verbs when phonological complexity was controlled had damage to the left ventral supramarginal gyrus (in the inferior parietal lobe), an area associated with phonological short-term memory. When these frontal and parietal regions were spared in patients who had damage to subcortical (n = 2) or posterior temporo-parietal regions (n = 3), past tense production was relatively unimpaired for both regular and irregular forms. The remaining (12th) patient was impaired in producing regular past tense but was significantly less accurate when producing irregular past tense. This patient had frontal, parietal, subcortical and posterior temporo-parietal damage, but was distinguished from the other patients by damage to the left anterior temporal cortex, an area associated with semantic processing. We consider how our lesion site and behavioral observations have implications for theoretical accounts of past tense production.
Abstract:
During glacial periods, dust deposition rates and inferred atmospheric concentrations were globally much higher than present. According to recent model results, the large enhancement of atmospheric dust content at the last glacial maximum (LGM) can be explained only if increases in the potential dust source areas are taken into account. Such increases are to be expected, due to effects of low precipitation and low atmospheric (CO2) on plant growth. Here the modelled three-dimensional dust fields from Mahowald et al. and modelled seasonally varying surface-albedo fields derived in a parallel manner, are used to quantify the mean radiative forcing due to modern (non-anthropogenic) and LGM dust. The effect of mineralogical provenance on the radiative properties of the dust is taken into account, as is the range of optical properties associated with uncertainties about the mixing state of the dust particles. The high-latitude (poleward of 45°) mean change in forcing (LGM minus modern) is estimated to be small (–0.9 to +0.2 W m–2), especially when compared to nearly –20 W m–2 due to reflection from the extended ice sheets. Although the net effect of dust over ice sheets is a positive forcing (warming), much of the simulated high-latitude dust was not over the ice sheets, but over unglaciated regions close to the expanded dust source region in central Asia. In the tropics the change in forcing is estimated to be overall negative, and of similarly large magnitude (–2.2 to –3.2 W m–2) to the radiative cooling effect of low atmospheric (CO2). Thus, the largest long-term climatic effect of the LGM dust is likely to have been a cooling of the tropics. Low tropical sea-surface temperatures, low atmospheric (CO2) and high atmospheric dust loading may be mutually reinforcing due to multiple positive feedbacks, including the negative radiative forcing effect of dust.
Abstract:
Whole-genome sequencing (WGS) could potentially provide a single platform for extracting all the information required to predict an organism’s phenotype. However, its ability to provide accurate predictions has not yet been demonstrated in large independent studies of specific organisms. In this study, we aimed to develop a genotypic prediction method for antimicrobial susceptibilities. The whole genomes of 501 unrelated Staphylococcus aureus isolates were sequenced, and the assembled genomes were interrogated using BLASTn for a panel of known resistance determinants (chromosomal mutations and genes carried on plasmids). Results were compared with phenotypic susceptibility testing for 12 commonly used antimicrobial agents (penicillin, methicillin, erythromycin, clindamycin, tetracycline, ciprofloxacin, vancomycin, trimethoprim, gentamicin, fusidic acid, rifampin, and mupirocin) performed by the routine clinical laboratory. We investigated discrepancies by repeat susceptibility testing and manual inspection of the sequences and used this information to optimize the resistance determinant panel and BLASTn algorithm. We then tested performance of the optimized tool in an independent validation set of 491 unrelated isolates, with phenotypic results obtained in duplicate by automated broth dilution (BD Phoenix) and disc diffusion. In the validation set, the overall sensitivity and specificity of the genomic prediction method were 0.97 (95% confidence interval [95% CI], 0.95 to 0.98) and 0.99 (95% CI, 0.99 to 1), respectively, compared to standard susceptibility testing methods. The very major error rate was 0.5%, and the major error rate was 0.7%. WGS was as sensitive and specific as routine antimicrobial susceptibility testing methods. WGS is a promising alternative to culture methods for resistance prediction in S. aureus and ultimately other major bacterial pathogens.
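The headline performance measures reported above can be reproduced from a confusion table; note that denominator conventions for very major and major error rates vary between guidelines, so the definitions below are an assumption:

```python
# Compute sensitivity, specificity and error rates from counts, treating
# phenotypic resistance as the positive class. Very major error = resistant
# isolate predicted susceptible; major error = susceptible isolate predicted
# resistant (denominator conventions assumed; they differ between guidelines).
def prediction_metrics(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    very_major_error = fn / (tp + fn)
    major_error = fp / (tn + fp)
    return sensitivity, specificity, very_major_error, major_error
```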
Abstract:
Monte Carlo algorithms often aim to draw from a distribution π by simulating a Markov chain with transition kernel P such that π is invariant under P. However, there are many situations for which it is impractical or impossible to draw from the transition kernel P. For instance, this is the case with massive datasets, where it is prohibitively expensive to calculate the likelihood, and for intractable likelihood models arising from, for example, Gibbs random fields, such as those found in spatial statistics and network analysis. A natural approach in these cases is to replace P by an approximation P̂. Using theory from the stability of Markov chains, we explore a variety of situations where it is possible to quantify how ‘close’ the chain given by the transition kernel P̂ is to the chain given by P. We apply these results to several examples from spatial statistics and network analysis.
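A minimal sketch of a chain driven by an approximate kernel P̂: a random-walk Metropolis sampler whose target log-density is supplied as an argument, so an approximate log-density (for example, one based on a subsampled likelihood) yields exactly the kind of perturbed chain the paper analyses. The standard Gaussian target used in the example is an illustrative assumption:

```python
import math
import random

# Random-walk Metropolis targeting exp(approx_logpi). Supplying an approximate
# log-density makes this a chain with transition kernel P-hat rather than P,
# which is the situation studied above.
def mh_chain(approx_logpi, x0, n_steps, step=1.0, rng=random):
    """Return a list of n_steps states from a random-walk Metropolis chain."""
    x, chain = x0, []
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)
        log_alpha = approx_logpi(y) - approx_logpi(x)
        if log_alpha >= 0 or rng.random() < math.exp(log_alpha):
            x = y  # accept the proposal
        chain.append(x)
    return chain
```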