903 results for "Geo-statistical model"
Abstract:
We recently generated a knock-in mouse model (PYGM p.R50X/p.R50X) of McArdle disease (myophosphorylase deficiency). One mechanistic approach to unveil the molecular alterations caused by myophosphorylase deficiency, which is arguably the paradigm of 'exercise intolerance', is to compare the skeletal-muscle tissue of McArdle, heterozygous, and healthy (wild-type, wt) mice. In quadriceps muscle of p.R50X/p.R50X (n=4), p.R50X/wt (n=6) and wt/wt (n=5) mice (all male, 8 wk old), we analyzed molecular markers of energy-sensing pathways, oxidative phosphorylation (OXPHOS) and autophagy/proteasome systems, oxidative damage, and sarcoplasmic reticulum (SR) Ca handling. We found a significant group effect for total AMPK (tAMPK) and the ratio of phosphorylated AMPK (pAMPK) to tAMPK (P=0.012 and 0.033), with higher mean values in p.R50X/p.R50X mice than in the other two groups. The absence of massive accumulation of ubiquitinated proteins, autophagosomes or lysosomes in p.R50X/p.R50X mice suggested no major alterations in autophagy/proteasome systems. Citrate synthase activity was lower in p.R50X/p.R50X mice than in the other two groups (P=0.036), but there was no statistical effect for respiratory chain complexes. We found higher levels of 4-hydroxy-2-nonenal-modified proteins in p.R50X/p.R50X and p.R50X/wt mice compared with the wt/wt group (P=0.011). Sarco(endo)plasmic reticulum ATPase 1 (SERCA1) levels detected at 110 kDa tended to be higher in p.R50X/p.R50X and p.R50X/wt mice compared with wt/wt animals (P=0.076), but SERCA1 enzyme activity was normal. We also found an accumulation of phosphorylated SERCA1 in p.R50X/p.R50X animals. Thus, myophosphorylase deficiency causes alterations in energy-sensing pathways, together with some evidence of oxidative damage and altered Ca handling, but without major alterations in OXPHOS capacity or autophagy/ubiquitination pathways, suggesting that the muscle tissue of patients is likely to adapt favorably overall to exercise training interventions.
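As a hedged illustration of the three-group comparisons reported above, the sketch below runs a one-way ANOVA on a muscle marker across the wt/wt, p.R50X/wt and p.R50X/p.R50X groups; the values are invented placeholders matching only the group sizes, not data from the study.

```python
# Hypothetical sketch: one-way ANOVA for a group effect across the three
# genotypes. Values are illustrative placeholders, not data from the study.
from scipy.stats import f_oneway

# e.g. pAMPK/tAMPK ratios (arbitrary units) for each genotype group
wt_wt = [0.42, 0.39, 0.45, 0.41, 0.40]          # n=5
r50x_wt = [0.44, 0.47, 0.43, 0.46, 0.45, 0.42]  # n=6
r50x_r50x = [0.61, 0.58, 0.64, 0.60]            # n=4

f_stat, p_value = f_oneway(wt_wt, r50x_wt, r50x_r50x)
print(f"group effect: F={f_stat:.2f}, P={p_value:.3f}")
```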
Abstract:
A method for deformable shape detection and recognition is described. Deformable shape templates are used to partition the image into a globally consistent interpretation, determined in part by the minimum description length principle. Statistical shape models enforce the prior probabilities on global, parametric deformations for each object class. Once trained, the system autonomously segments deformed shapes from the background, while not merging them with adjacent objects or shadows. The formulation can be used to group image regions based on any image homogeneity predicate; e.g., texture, color, or motion. The recovered shape models can be used directly in object recognition. Experiments with color imagery are reported.
Abstract:
A neural model is presented of how cortical areas V1, V2, and V4 interact to convert a textured 2D image into a representation of curved 3D shape. Two basic problems are solved to achieve this: (1) Patterns of spatially discrete 2D texture elements are transformed into a spatially smooth surface representation of 3D shape. (2) Changes in the statistical properties of texture elements across space induce the perceived 3D shape of this surface representation. This is achieved in the model through multiple-scale filtering of a 2D image, followed by a cooperative-competitive grouping network that coherently binds texture elements into boundary webs at the appropriate depths using a scale-to-depth map and a subsequent depth competition stage. These boundary webs then gate filling-in of surface lightness signals in order to form a smooth 3D surface percept. The model quantitatively simulates challenging psychophysical data about perception of prolate ellipsoids (Todd and Akerstrom, 1987, J. Exp. Psych., 13, 242). In particular, the model represents a high degree of 3D curvature for a certain class of images, all of whose texture elements have the same degree of optical compression, in accordance with percepts of human observers. Simulations of 3D percepts of an elliptical cylinder, a slanted plane, and a photo of a golf ball are also presented.
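The model's front end is multiple-scale filtering of the 2D image. The sketch below is a minimal stand-in for that stage, assuming a simple difference-of-Gaussians filter bank (the scales and the 1.6 surround ratio are illustrative assumptions; the grouping, scale-to-depth and filling-in stages are not reproduced).

```python
# Minimal sketch of a multiple-scale filtering stage: a difference-of-
# Gaussians (center-surround) filter bank applied at several spatial scales.
import numpy as np
from scipy.ndimage import gaussian_filter

def multiscale_responses(image, scales=(1.0, 2.0, 4.0, 8.0)):
    """Return one center-surround contrast map per spatial scale."""
    responses = []
    for sigma in scales:
        center = gaussian_filter(image, sigma)
        surround = gaussian_filter(image, 1.6 * sigma)  # assumed surround ratio
        responses.append(center - surround)
    return responses

image = np.random.rand(128, 128)  # placeholder for a textured 2D image
maps = multiscale_responses(image)
```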
Abstract:
A method for solving the stationary state probability is presented for the first-order bang-bang phase-locked loop (BBPLL) with nonzero loop delay. The method is based on a delayed Markov chain model and a state flow diagram for tracing the state history due to the loop delay. As a result, an eigenequation is obtained, and its closed-form solutions are derived for some cases. After obtaining the state probability, statistical characteristics such as the mean gain of the binary phase detector and the timing error variance are calculated and demonstrated.
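As a generic illustration of the stationary-probability step, the sketch below solves the eigenequation pi = pi P for a small Markov chain by taking the left eigenvector of the transition matrix at eigenvalue 1; the 3-state matrix is an arbitrary placeholder, not the delayed chain derived in the paper.

```python
# Generic sketch: stationary distribution of a finite Markov chain via the
# left eigenvector of the transition matrix for eigenvalue 1. The matrix is
# a placeholder, not the BBPLL's delay-induced transition structure.
import numpy as np

P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.6, 0.2],
              [0.0, 0.3, 0.7]])  # rows sum to 1

eigvals, eigvecs = np.linalg.eig(P.T)
i = np.argmin(np.abs(eigvals - 1.0))   # locate the eigenvalue closest to 1
pi = np.real(eigvecs[:, i])
pi /= pi.sum()                         # normalize to a probability vector
print("stationary state probabilities:", pi)
```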
Abstract:
Background: Obesity is the most important health challenge faced at a global level and represents a rapidly growing problem for the health of populations. Given the escalating global health problem of obesity and its co-morbidities, the need to re-appraise its management is more compelling than ever. The normalisation of obesity within our society and the acceptance of higher body weights have left individuals unaware of the reality of their weight status and the gravity of this situation. Recognition of the problem is a key component of obesity management, and it remains especially crucial to address this issue. A large amount of research has been undertaken on obesity; however, limited research has used the Health Belief Model. Aim: The aim of the research was to determine factors relating to motivation to change behaviour in individuals who perceive themselves to be overweight, and to investigate whether the constructs of the Health Belief Model help to explain motivation to change behaviour. Method: The research design was quantitative, correlational and cross-sectional, and was guided by the Health Belief Model. Data Collection: Data were collected online using a multi-section, multi-item questionnaire developed from a review of the theoretical and empirical research. Descriptive and inferential statistical analyses were employed to describe relationships between variables. Sample: A sample of 202 men and women who perceived themselves to be overweight participated in the research. Results: Following multivariate regression analysis, perceived barriers to weight loss and perceived benefits of weight loss were significant predictors of motivation to change behaviour. The significant perceived barriers to weight loss were psychological barriers to weight loss (p=0.019) and environmental barriers to physical activity (p=0.032). The greatest predictor of motivation to change behaviour was the perceived benefits of weight loss (p<0.001). Perceived susceptibility to obesity and perceived severity of obesity did not emerge as significant predictors in this model. The total variance explained by the model was 33.5%. Conclusion: Perceived barriers to weight loss and perceived benefits of weight loss are important determinants of motivation to change behaviour. The current study demonstrated the limited applicability of the Health Belief Model constructs to motivation to change behaviour, as not all core dimensions proved significant predictors of the dependent variable.
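As a hedged sketch of the kind of multivariate regression reported in the Results, the code below regresses a motivation score on Health Belief Model constructs; all variables are random placeholders (only the sample size of 202 matches the study), so the output is illustrative, not the study's model.

```python
# Hedged sketch: motivation to change behaviour regressed on HBM constructs.
# All data are random placeholders, not the study sample.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 202  # sample size matching the study
predictors = pd.DataFrame({
    "perceived_benefits": rng.normal(size=n),
    "psychological_barriers": rng.normal(size=n),
    "environmental_barriers": rng.normal(size=n),
    "susceptibility": rng.normal(size=n),
    "severity": rng.normal(size=n),
})
motivation = rng.normal(size=n)  # placeholder outcome score

model = sm.OLS(motivation, sm.add_constant(predictors)).fit()
print(model.summary())  # coefficients, p-values, R-squared (variance explained)
```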
Abstract:
The combinatorial model of nuclear level densities has now reached a level of accuracy comparable to that of the best global analytical expressions, without suffering from the limits imposed by the statistical hypothesis on which the latter expressions rely. In particular, it naturally provides non-Gaussian spin distributions as well as non-equipartition of parities, which are known to have an impact on cross-section predictions at low energies [1, 2, 3]. Our previous global models developed in Refs. [1, 2] suffered from deficiencies, in particular in the way the collective effects - both vibrational and rotational - were treated. We have recently improved this treatment by simultaneously using the single-particle levels and collective properties predicted by a newly derived Gogny interaction [4], thereby enabling a microscopic description of energy-dependent shell, pairing and deformation effects. In addition, for deformed nuclei, the transition to sphericity is coherently taken into account on the basis of a temperature-dependent Hartree-Fock calculation, which provides at each temperature the structure properties needed to build the level densities. This new method is described and shown to give promising results with respect to available experimental data.
Abstract:
BACKGROUND: The rate of emergence of human pathogens is steadily increasing; most of these novel agents originate in wildlife. Bats, remarkably, are the natural reservoirs of many of the most pathogenic viruses in humans. There are two bat genome projects currently underway, a circumstance that promises to speed the discovery of host factors important in the coevolution of bats with their viruses. These genomes, however, are not yet assembled, and one of them will provide only low coverage, making the inference of most genes of immunological interest error-prone. Many more wildlife genome projects are underway and intend to provide only shallow coverage. RESULTS: We have developed a statistical method for the assembly of gene families from partial genomes. The method takes full advantage of the quality scores generated by base-calling software, incorporating them into a complete probabilistic error model to overcome the limitations inherent in inferring gene family members from partial sequence information. We validated the method by inferring the human IFNA genes from the genome trace archives, and used it to infer 61 type-I interferon genes, as well as single type-II interferon genes, in the bats Pteropus vampyrus and Myotis lucifugus. We confirmed our inferences by direct cloning and sequencing of IFNA, IFNB, IFND, and IFNK in P. vampyrus, and by demonstrating transcription of some of the inferred genes by known interferon-inducing stimuli. CONCLUSION: The statistical trace assembler described here provides a reliable method for extracting information from the many available and forthcoming partial or shallow genome sequencing projects, thereby facilitating the study of a wider variety of organisms with ecological and biomedical significance to humans than would otherwise be possible.
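The raw ingredient of the error model is the per-base quality score. As a minimal illustration, the sketch below converts Phred quality scores into base-call error probabilities via the standard relation P(error) = 10^(-Q/10); the scores themselves are made up, and the paper's full probabilistic trace-assembly model goes well beyond this step.

```python
# Minimal sketch: converting Phred quality scores from base-calling software
# into error probabilities, P(error) = 10**(-Q/10). Scores are hypothetical.
quality_scores = [40, 35, 20, 12, 38]  # placeholder Phred scores for one read

for q in quality_scores:
    p_error = 10 ** (-q / 10)
    print(f"Q={q:2d} -> P(base-call error) = {p_error:.5f}")
```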
Abstract:
Technological advances in genotyping have given rise to hypothesis-based association studies of increasing scope. As a result, the scientific hypotheses addressed by these studies have become more complex and more difficult to address using existing analytic methodologies. Obstacles to analysis include inference in the face of multiple comparisons, complications arising from correlations among the SNPs (single nucleotide polymorphisms), choice of their genetic parametrization and missing data. In this paper we present an efficient Bayesian model search strategy that searches over the space of genetic markers and their genetic parametrization. The resulting method for Multilevel Inference of SNP Associations, MISA, allows computation of multilevel posterior probabilities and Bayes factors at the global, gene and SNP level, with the prior distribution on SNP inclusion in the model providing an intrinsic multiplicity correction. We use simulated data sets to characterize MISA's statistical power, and show that MISA has higher power to detect association than standard procedures. Using data from the North Carolina Ovarian Cancer Study (NCOCS), MISA identifies variants that were not identified by standard methods and have been externally "validated" in independent studies. We examine sensitivity of the NCOCS results to prior choice and method for imputing missing data. MISA is available in an R package on CRAN.
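MISA itself is distributed as an R package; as a loose, hedged illustration of SNP-level evidence of association, the sketch below approximates a Bayes factor for including a single SNP through the BIC difference between nested logistic models (simulated placeholder data; this is not the MISA algorithm, which searches over many SNPs and parametrizations with a multiplicity-correcting prior).

```python
# Generic sketch (not MISA): BIC-approximated Bayes factor for including one
# SNP in a logistic model of case-control status, using simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
snp = rng.integers(0, 3, size=n)              # genotype coded 0/1/2
y = rng.binomial(1, 0.3 + 0.1 * (snp == 2))   # outcome weakly tied to genotype

null = sm.Logit(y, np.ones((n, 1))).fit(disp=0)
alt = sm.Logit(y, sm.add_constant(snp.astype(float))).fit(disp=0)

# BF ~ exp(-(BIC_alt - BIC_null) / 2); values > 1 favor including the SNP
bayes_factor = np.exp(-(alt.bic - null.bic) / 2)
print(f"approximate Bayes factor for SNP inclusion: {bayes_factor:.2f}")
```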
Abstract:
In the mnemonic model of posttraumatic stress disorder (PTSD), the current memory of a negative event, not the event itself, determines symptoms. The model is an alternative to the current event-based etiology of PTSD represented in the Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; American Psychiatric Association, 2000). The model accounts for important and reliable findings that are often inconsistent with the current diagnostic view and that have been neglected by theoretical accounts of the disorder, including the following observations. The diagnosis needs objective information about the trauma and peritraumatic emotions but uses retrospective memory reports that can have substantial biases. Negative events and emotions that do not satisfy the current diagnostic criteria for a trauma can be followed by symptoms that would otherwise qualify for PTSD. Predisposing factors that affect the current memory have large effects on symptoms. The inability-to-recall-an-important-aspect-of-the-trauma symptom does not correlate with other symptoms. Loss or enhancement of the trauma memory affects PTSD symptoms in predictable ways. Special mechanisms that apply only to traumatic memories are not needed, increasing parsimony and the knowledge that can be applied to understanding PTSD.
Abstract:
X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well-diffracting crystals remains the critical bottleneck in going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. A better physico-chemical understanding remains elusive because of the large number of variables involved; hence, little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side-chain entropy and has been extensively reported in the literature. The other involves specific electrostatic interactions not previously described in the crystallization context. Because evidence for the two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes, analyzed through state-of-the-art statistical models, may thus guide macromolecular crystallization toward a more rational basis.
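A minimal sketch of the modeling idea, assuming a Gaussian process classifier mapping a few protein properties to crystallization outcome; the features, kernel, and data below are illustrative placeholders (only the 182-protein sample size echoes the study), not the authors' trained models.

```python
# Hedged sketch: Gaussian process linking protein properties to
# crystallization outcome. Features and labels are random placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)
X = rng.normal(size=(182, 3))      # e.g. side-chain entropy, pI, net charge
y = rng.integers(0, 2, size=182)   # 1 = crystallized, 0 = did not

gp = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
gp.fit(X, y)
print("P(crystallizes) for one protein:", gp.predict_proba(X[:1])[0, 1])
```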
Abstract:
Forest fires can cause extensive damage to natural resources and property. They can also destroy wildlife habitat, affect the forest ecosystem and threaten human lives. In this paper, extreme wildland fires are analysed using a point process model for extremes. The model, based on a generalised Pareto distribution, is used to model data on acres of wildland burnt by extreme fires in the US since 1825. A semi-parametric smoothing approach is adopted, together with the maximum likelihood method, to estimate the model parameters.
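A minimal sketch of the peaks-over-threshold step, assuming synthetic burnt-acreage data: exceedances over a high threshold are fitted with a generalised Pareto distribution by maximum likelihood (the data and threshold choice are placeholders, not the 1825-onward US record).

```python
# Sketch: fitting a generalised Pareto distribution to exceedances over a
# high threshold via MLE. Data are heavy-tailed placeholders, not the record.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
acres = rng.pareto(1.5, size=2000) * 1e4      # synthetic heavy-tailed data

threshold = np.quantile(acres, 0.95)          # placeholder threshold choice
exceedances = acres[acres > threshold] - threshold

shape, loc, scale = genpareto.fit(exceedances, floc=0)  # location fixed at 0
print(f"GPD shape={shape:.3f}, scale={scale:.1f}, threshold={threshold:.0f}")
```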
Abstract:
Ocean biogeochemistry (OBGC) models span a wide variety of complexities, including highly simplified nutrient-restoring schemes, nutrient–phytoplankton–zooplankton–detritus (NPZD) models that crudely represent the marine biota, models that represent a broader trophic structure by grouping organisms as plankton functional types (PFTs) based on their biogeochemical role (dynamic green ocean models) and ecosystem models that group organisms by ecological function and trait. OBGC models are now integral components of Earth system models (ESMs), but they compete for computing resources with higher resolution dynamical setups and with other components such as atmospheric chemistry and terrestrial vegetation schemes. As such, the choice of OBGC in ESMs needs to balance model complexity and realism alongside relative computing cost. Here we present an intercomparison of six OBGC models that were candidates for implementation within the next UK Earth system model (UKESM1). The models cover a large range of biological complexity (from 7 to 57 tracers) but all include representations of at least the nitrogen, carbon, alkalinity and oxygen cycles. Each OBGC model was coupled to the ocean general circulation model Nucleus for European Modelling of the Ocean (NEMO) and results from physically identical hindcast simulations were compared. Model skill was evaluated for biogeochemical metrics of global-scale bulk properties using conventional statistical techniques. The computing cost of each model was also measured in standardised tests run at two resource levels. No model is shown to consistently outperform all other models across all metrics. Nonetheless, the simpler models are broadly closer to observations across a number of fields and thus offer a high-efficiency option for ESMs that prioritise high-resolution climate dynamics. However, simpler models provide limited insight into more complex marine biogeochemical processes and ecosystem pathways, and a parallel approach of low-resolution climate dynamics and high-complexity biogeochemistry is desirable in order to provide additional insights into biogeochemistry–climate interactions.
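As a hedged sketch of the "conventional statistical techniques" used for skill evaluation, the code below computes bias, RMSE and correlation between a model field and observations; both arrays are random placeholders standing in for gridded hindcast and observational data.

```python
# Sketch: bulk skill statistics (bias, RMSE, correlation) for a model field
# against observations. Arrays are random placeholders for gridded data.
import numpy as np

rng = np.random.default_rng(4)
obs = rng.normal(size=10_000)                      # observed field, flattened
model = obs + rng.normal(scale=0.5, size=10_000)   # model field with error

bias = np.mean(model - obs)
rmse = np.sqrt(np.mean((model - obs) ** 2))
corr = np.corrcoef(model, obs)[0, 1]
print(f"bias={bias:.3f}, RMSE={rmse:.3f}, r={corr:.3f}")
```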
Abstract:
This paper reports a study carried out to develop a self-compacting fibre-reinforced concrete with a high fibre content: slurry-infiltrated fibre concrete (SIFCON). The SIFCON was developed with 10% steel fibres infiltrated by a self-compacting cement slurry without any vibration; traditionally, the infiltration of the slurry into the layer of fibres is carried out under intensive vibration. A two-level fractional factorial design was used to optimise the properties of cement-based slurries with four independent variables: dosage of silica fume, dosage of superplasticiser (SP), sand content, and water/cement ratio (W/C). A rheometer, the mini-slump test, the Lombardi plate cohesion meter, the J-fibre penetration test, and induced bleeding were used to assess the behaviour of the fresh cement slurries. The compressive strengths at 7 and 28 days were also measured. The statistical models are valid for slurries made with a W/C of 0.40 to 0.50, 50 to 100% sand by mass of cement, 5 to 10% silica fume by mass of cement, and an SP dosage of 0.6 to 1.2% by mass of cement. The models make it possible to evaluate the effect of the individual variables on the measured parameters of fresh cement slurries, and they offer useful information for understanding trade-offs between mix variables and comparing the responses obtained from the various test methods in order to optimise self-compacting SIFCON.
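A minimal sketch of a two-level fractional factorial design of the kind described: a 2^(4-1) half fraction for the four coded factors (silica fume, SP dosage, sand content, W/C), with the fourth column generated as D = ABC and main effects estimated by contrast; the response values are invented placeholders, not the measured slurry data.

```python
# Sketch: 2**(4-1) two-level fractional factorial design, generator D = ABC.
# Factors are coded -1/+1; responses are invented placeholders (e.g. MPa).
import itertools
import numpy as np

base = np.array(list(itertools.product([-1, 1], repeat=3)))  # full 2^3 design
design = np.column_stack([base, base[:, 0] * base[:, 1] * base[:, 2]])

response = np.array([31, 35, 33, 40, 32, 38, 36, 45])  # placeholder strengths

# Main effect of each factor: mean response at +1 minus mean at -1
for name, col in zip("ABCD", design.T):
    effect = response[col == 1].mean() - response[col == -1].mean()
    print(f"main effect of {name}: {effect:+.2f}")
```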
Abstract:
This paper provides a summary of our studies on robust speech recognition based on a new statistical approach - the probabilistic union model. We consider speech recognition in the situation where part of the acoustic features may be corrupted by noise. The union model is a method for basing recognition on the clean part of the features, thereby reducing the effect of the noise on recognition. In this respect, the union model is similar to the missing-feature method; however, the two methods achieve this end through different routes. The missing-feature method usually requires the identity of the noisy data for noise removal, while the union model combines the local features based on the union of random events, to reduce the model's dependence on information about the noise. We previously investigated applications of the union model to speech recognition involving unknown partial corruption in frequency bands, in time duration, and in feature streams. Additionally, a combination of the union model with conventional noise-reduction techniques was studied as a means of dealing with a mixture of known or trainable noise and unknown, unexpected noise. In this paper, a unified review of each of these applications is provided in the context of dealing with unknown partial feature corruption, giving the appropriate theory and implementation algorithms, along with an experimental evaluation.
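As a hedged sketch of the combination rule at the heart of such a model, the code below scores an utterance from per-stream likelihoods under the assumption that at most one stream is corrupted: the score is the sum of the leave-one-out products of the stream likelihoods (a common order-1 union form; the exact formulation in the paper may differ, and the likelihood values are arbitrary placeholders).

```python
# Hedged sketch: order-1 union combination of per-stream likelihoods,
# assuming at most one stream is noise-corrupted. Values are placeholders.
import numpy as np

def union_score(stream_likelihoods):
    """Sum of leave-one-out products of the stream likelihoods."""
    p = np.asarray(stream_likelihoods, dtype=float)
    total = p.prod()
    return np.sum(total / p)  # assumes every p_i > 0

clean = [0.8, 0.7, 0.9, 0.75]   # all streams match the model well
noisy = [0.8, 0.7, 0.9, 1e-4]   # one stream corrupted by noise
print(union_score(clean), union_score(noisy))
# Unlike a plain product, the union score is not wiped out by the one
# corrupted stream, so recognition can rest on the clean streams.
```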