957 results for Statistical approach
Abstract:
The conditions for maximizing the enzymatic activity of lipase entrapped in a sol-gel matrix were determined for different vegetable oils using an experimental design. The effects of pH, temperature, and biocatalyst loading on lipase activity were assessed using a central composite experimental design, leading to a set of 13 assays, followed by response surface analysis. For canola oil and entrapped lipase, statistical analyses showed significant effects for pH and temperature, as well as for the interactions between pH and temperature and between temperature and biocatalyst loading. For olive oil and entrapped lipase, pH was the only statistically significant variable. This study demonstrated that response surface analysis is an appropriate methodology for maximizing the percentage of hydrolysis as a function of pH, temperature, and lipase loading.
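The response-surface step described above amounts to fitting a second-order polynomial to the design points by ordinary least squares. The sketch below uses two factors for brevity (the study used three) and synthetic data, not the study's measurements:

```python
import numpy as np

def fit_response_surface(x1, x2, y):
    """Fit the second-order RSM model
       y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    by ordinary least squares."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic check: recover known coefficients from noiseless data.
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 13)   # 13 runs, echoing the design size in the abstract
x2 = rng.uniform(-1, 1, 13)
true = np.array([50.0, 3.0, -2.0, -5.0, -4.0, 1.5])
y = true @ np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2]).T
print(np.round(fit_response_surface(x1, x2, y), 3))  # recovers `true`
```

Once the coefficients are estimated, the optimum (e.g., maximum percentage of hydrolysis) is located on the fitted surface rather than by exhaustive experimentation.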
Abstract:
This paper presents an Adaptive Maximum Entropy (AME) approach for modeling biological species. The Maximum Entropy algorithm (MaxEnt) is one of the most widely used methods for modeling the geographical distribution of biological species. The approach presented here is an alternative to the classical algorithm. Instead of using the same set of features throughout training, the AME approach tries to insert or remove a single feature at each iteration. The aim is to reach convergence faster without degrading the performance of the generated models. Preliminary experiments showed an improvement in both accuracy and execution time. Comparisons with other algorithms are beyond the scope of this paper. Several important research directions are proposed as future work.
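The per-iteration insert-or-remove idea can be illustrated with a minimal greedy sketch. This is not the authors' AME implementation: the `score` function (higher is better, e.g. a training log-likelihood) and the candidate feature pool are assumptions made purely for illustration.

```python
def adaptive_feature_search(candidates, score, max_iters=50):
    """Greedy sketch: at each iteration try inserting or removing
    ONE feature, keeping the move only if the model score improves."""
    active = set()
    best = score(active)
    for _ in range(max_iters):
        best_move, best_gain = None, 0.0
        # Toggling covers both moves: insert if absent, remove if present.
        for f in candidates:
            trial = active ^ {f}
            gain = score(trial) - best
            if gain > best_gain:
                best_move, best_gain = f, gain
        if best_move is None:  # converged: no single move helps
            break
        active ^= {best_move}
        best += best_gain
    return active, best

# Toy usage: the score rewards features 'a' and 'b' and penalizes model size.
weights = {"a": 2.0, "b": 1.5, "c": -0.5}
toy_score = lambda s: sum(weights[f] for f in s) - 0.1 * len(s)
selected, final = adaptive_feature_search(["a", "b", "c"], toy_score)
print(sorted(selected), round(final, 2))  # → ['a', 'b'] 3.3
```

The early stop when no single move improves the score is what makes convergence potentially faster than re-training with a fixed feature set.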
Abstract:
The antioxidant activity of natural and synthetic compounds was evaluated using five in vitro methods: ferric reducing/antioxidant power (FRAP), 2,2-diphenyl-1-picrylhydrazyl (DPPH), oxygen radical absorbance capacity (ORAC), oxidation of an aqueous dispersion of linoleic acid accelerated by azo-initiators (LAOX), and oxidation of a meat homogenate submitted to thermal treatment (TBARS). All results were expressed as Trolox equivalents. The application of multivariate statistical techniques suggested that the phenolic compounds (caffeic acid, carnosic acid, genistein, and resveratrol), beyond their high antioxidant activity measured by the DPPH, FRAP, and TBARS methods, showed the highest ability to react with the radicals in the ORAC methodology, compared with the other compounds evaluated in this study (ascorbic acid, erythorbate, tocopherol, BHT, Trolox, tryptophan, citric acid, EDTA, glutathione, lecithin, methionine, and tyrosine). This property was significantly correlated with the number of phenolic rings and the catecholic structure present in the molecule. Based on a multivariate analysis, it is possible to select compounds from different clusters and explore their antioxidant activity interactions in food products.
Abstract:
Genetic recombination can produce heterogeneous phylogenetic histories within a set of homologous genes. Delineating recombination events is important in the study of molecular evolution, as inference of such events provides a clearer picture of the phylogenetic relationships among different gene sequences or genomes. Nevertheless, detecting recombination events can be a daunting task, as the performance of different recombination-detecting approaches can vary, depending on evolutionary events that take place after recombination. We recently evaluated the effects of post-recombination events on the prediction accuracy of recombination-detecting approaches using simulated nucleotide sequence data. The main conclusion, supported by other studies, is that one should not depend on a single method when searching for recombination events. In this paper, we introduce a two-phase strategy, applying three statistical measures to detect the occurrence of recombination events, and a Bayesian phylogenetic approach in delineating breakpoints of such events in nucleotide sequences. We evaluate the performance of these approaches using simulated data, and demonstrate the applicability of this strategy to empirical data. The two-phase strategy proves to be time-efficient when applied to large datasets, and yields high-confidence results.
Abstract:
The present paper proposes an approach to obtaining the activation energy distribution for chemisorption of oxygen onto carbon surfaces, while simultaneously allowing for the activation energy dependence of the pre-exponential factor of the rate constant. Prior studies in this area have considered this factor to be uniform, thereby biasing the estimated distributions. The results show that the derived activation energy distribution is not sensitive to the chemisorption mechanism because of the step-function-like behavior of the coverage. The activation energy distribution is essentially uniform for some carbons, and for others has two or possibly more discrete stages, suggestive of at least two types of sites, each with its own uniform distribution. The pre-exponential factors of the reactions are determined directly from the experimental data and are found not to be constant, as assumed in earlier work, but correlated with the activation energy. The latter result empirically follows an exponential function, supporting some earlier statistical and experimental work. The activation energy distribution obtained in the present paper permits improved correlation of chemisorption data in comparison with earlier studies.
Abstract:
This paper is part of a larger study assessing the adequacy of the use of multivariate statistical techniques in theses and dissertations on consumer behavior produced in the marketing programs of some higher education institutions from 1997 to 2006. This paper focuses on regression and conjoint analysis, two techniques with great potential for use in marketing studies. The objective was to analyze whether the employment of these techniques suits the needs of the research problem presented, as well as to evaluate the level of success in meeting their premises. Overall, the results suggest the need for greater involvement of researchers in verifying all the theoretical precepts for applying the techniques classified in the category of investigation of dependence among variables.
Abstract:
We prove that, once an algorithm of perfect simulation for a stationary and ergodic random field F taking values in S^(Z^d), with S a bounded subset of R^n, is provided, the convergence in the mean ergodic theorem occurs exponentially fast for F. Applications from (non-equilibrium) statistical mechanics and interacting particle systems are presented.
Abstract:
The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the focus of many studies in neuroscience. The complex neural network structure and its correlations with brain functions have played a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for the exploration and characterization of the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROI) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROI(s). In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method is to employ multiple eigen-time series in each ROI to avoid the temporal information loss incurred during identification of Granger causality. Such information loss is inherent in averaging (e.g., to yield a single "representative" time series per ROI) and may, in turn, lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping.
By using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first-principal-component estimation from ROIs). The usefulness of the CGA approach on real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI.
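The bootstrap significance test can be sketched generically. This is not the authors' CGA algorithm: the connectivity statistic (here a crude lag-1 cross-correlation) and the shuffling-based null are illustrative assumptions only.

```python
import random

def lag1_crosscorr(x, y):
    """Correlation between x[t] and y[t+1] -- a crude directed-influence statistic."""
    x, y = x[:-1], y[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((v - mx) ** 2 for v in x) ** 0.5
    sy = sum((v - my) ** 2 for v in y) ** 0.5
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def bootstrap_pvalue(x, y, stat, n_boot=1000, seed=0):
    """Build a null distribution by shuffling x (destroying temporal coupling);
    the p-value is the fraction of resampled statistics at least as extreme
    as the observed one."""
    rng = random.Random(seed)
    observed = abs(stat(x, y))
    exceed = 0
    for _ in range(n_boot):
        xs = x[:]
        rng.shuffle(xs)
        if abs(stat(xs, y)) >= observed:
            exceed += 1
    return (exceed + 1) / (n_boot + 1)

# Toy usage: y is x delayed by one step, so the coupling should be detected.
x = [float(i % 7) for i in range(60)]
y = [0.0] + x[:-1]
print(bootstrap_pvalue(x, y, lag1_crosscorr))  # small p: the lag survives few shuffles
```

The same resampling skeleton applies whatever the statistic is; only `stat` would change to the cluster-level Granger measure.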
Abstract:
Posttraumatic stress disorder (PTSD) is a prevalent, disabling anxiety disorder, marked by behavioral and physiologic alterations, that commonly follows a chronic course. Exposure to a traumatic event is a necessary, but not sufficient, factor. There is evidence from twin studies supporting a significant genetic predisposition to PTSD; however, the precise genetic loci remain unclear. The objective of the present study was to identify, in a case-control design, whether the brain-derived neurotrophic factor (BDNF) val66met polymorphism (rs6265), the dopamine transporter (DAT1) three prime untranslated region (3'UTR) variable number of tandem repeats (VNTR), and the serotonin transporter (5-HTTLPR) short/long variants are associated with the development of PTSD in a group of victims of urban violence. All polymorphisms were genotyped in 65 PTSD patients, in 34 victims of violence without PTSD, and in a community control group (n = 335). We did not find a statistically significant association between the BDNF val66met or 5-HTTLPR polymorphisms and the traumatic phenotype. However, a statistical association was found between the DAT1 3'UTR VNTR nine-repeat allele and PTSD (OR = 1.82; 95% CI, 1.20-2.76). This preliminary result confirms previous reports supporting a susceptibility role for allele 9 in PTSD.
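An odds ratio with its confidence interval, as reported above, follows from a 2x2 allele-by-status table via the standard log-odds-ratio (Woolf) method. The counts below are hypothetical, chosen only to illustrate the calculation; they are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
         a = exposed cases,   b = exposed controls,
         c = unexposed cases, d = unexposed controls.
    CI from the normal approximation on the log odds ratio."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration.
or_, lo, hi = odds_ratio_ci(30, 20, 10, 40)
print(f"OR = {or_:.2f}; 95% CI, {lo:.2f}-{hi:.2f}")  # OR = 6.00; 95% CI, 2.45-14.68
```

An interval that excludes 1.0, as in the DAT1 result quoted (1.20-2.76), is what marks the association as statistically significant at the 5% level.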
Abstract:
Purpose: To verify the influence of cavity access diameter on demineralized dentin removal in the ART approach. Methods: 40 non-carious human premolars were randomly divided into four groups. The occlusal surface was ground flat and the teeth were sectioned mesio-distally. The hemi-sections were reassembled and occlusal access preparations were carried out using ball-shaped diamonds. The resulting occlusal openings were 1.0 mm, 1.4 mm, 1.6 mm, and 1.8 mm for Groups A, B, C, and D, respectively. Standardized artificial carious lesions were created and the demineralized dentin was excavated. After excavation, the cavities were analyzed using: (a) the tactile method; (b) caries-detection dye to stain demineralized dentin, as proposed by Smales & Fang; and (c) the Demineralized Tissue Removal index, as proposed in this study. Statistical analysis was performed using the Fisher, Spearman correlation coefficient, kappa, Kruskal-Wallis, and Miller tests (P < 0.05). Results: The three methods of evaluation showed no significant differences between Groups A and B or between Groups C and D, while statistically significant differences were observed for A vs. C, A vs. D, B vs. C, and B vs. D. Based on these results, the size of the occlusal access significantly affected the efficacy of demineralized tissue removal.
Abstract:
Applying programming techniques to detailed data for 406 rice farms in 21 villages for 1997 produces inefficiency measures which differ substantially from the results of simple yield and unit cost measures. For the Boro (dry) season, mean technical efficiency was 69.4 per cent, allocative efficiency was 81.3 per cent, cost efficiency was 56.2 per cent, and scale efficiency was 94.9 per cent. The Aman (wet) season results are similar, but a few points lower. Allocative inefficiency is due to overuse of labour, suggesting population pressure, and of fertiliser, where recommended rates may warrant revision. Second-stage regressions show that large families are more inefficient, whereas farmers with better access to input markets, and those who do less off-farm work, tend to be more efficient. The information on the sources of inter-farm performance differentials could be used by extension agents to help inefficient farmers. There is little excuse for such sub-optimal use of survey data, which are often collected at substantial cost.
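The figures quoted are consistent with the standard Farrell decomposition, in which cost (economic) efficiency is the product of technical and allocative efficiency. A quick check using the Boro-season percentages from the abstract:

```python
def cost_efficiency(technical, allocative):
    """Farrell decomposition: economic (cost) efficiency = TE * AE."""
    return technical * allocative

# Boro season figures from the abstract, as proportions.
te, ae = 0.694, 0.813
print(round(cost_efficiency(te, ae), 3))  # ~0.564, matching the reported 56.2%
```

The small gap between 56.4 and 56.2 per cent is rounding in the published means.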
Abstract:
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing tradition of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation, and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, where non-parametric methods such as decision trees and generalized additive models are promoted to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric, and Bayesian approaches. This paper is motivated by a medical problem where interest centres on developing a risk stratification system for morbidity in 1,710 cardiac patients given a suite of demographic, clinical, and preoperative variables. Although the methods we use are applied specifically to this case study, they can be applied in any field, irrespective of the type of response.
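The two-stage workflow can be caricatured in a few lines. This is only a schematic stand-in for the paper's framework: correlation screening replaces the exploratory stage, and simple least squares replaces the decision-tree/GAM and final-model steps, purely for illustration on invented data.

```python
def screen(variables, response, top_k=2):
    """Stage 1 (exploratory): rank candidate variables by |correlation| with the response."""
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den
    ranked = sorted(variables, key=lambda name: -abs(corr(variables[name], response)))
    return ranked[:top_k]

def fit_ols_1d(x, y):
    """Stage 2 (modelling): least-squares fit y = a + b*x for one retained variable."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return my - b * mx, b

# Toy data: 'age' drives the response, 'noise' does not (both names invented).
data = {"age": [1.0, 2.0, 3.0, 4.0], "noise": [5.0, -1.0, 2.0, 0.0]}
y = [2.1, 4.0, 6.2, 7.9]
keep = screen(data, y, top_k=1)
a, b = fit_ols_1d(data[keep[0]], y)
print(keep, round(a, 2), round(b, 2))  # → ['age'] 0.15 1.96
```

The point of the template is precisely this separation: a cheap, transparent variable-selection pass first, then a deliberate choice of predictive model on the reduced set.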