72 results for statistical methodology
Abstract:
We consider a simple Maier-Saupe statistical model with the inclusion of disorder degrees of freedom to mimic the phase diagram of a mixture of rodlike and disklike molecules. A quenched distribution of shapes leads to a phase diagram with two uniaxial nematic phases and a biaxial nematic structure. A thermalized distribution, however, which is more appropriate for liquid mixtures, precludes the stability of this biaxial phase. We then use a two-temperature formalism, and assume a separation of relaxation times, to show that a partial degree of annealing is already sufficient to stabilize a biaxial nematic structure.
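For orientation, a minimal sketch of the kind of lattice Maier-Saupe Hamiltonian with an extra shape degree of freedom that is typically written for such rod-disk mixtures; the coupling A and the shape variable λ_i are generic placeholders, not necessarily the authors' exact parametrization:

\[
\mathcal{H} = -\frac{A}{N}\sum_{i<j}\sum_{\mu,\nu} S_i^{\mu\nu} S_j^{\mu\nu},
\qquad
S_i^{\mu\nu} = \frac{\lambda_i}{2}\left(3\,n_i^{\mu} n_i^{\nu} - \delta^{\mu\nu}\right),
\]

where n_i is the orientation of molecule i and λ_i takes one value for rodlike and another for disklike units; treating the set {λ_i} as quenched, fully thermalized, or annealed on a slower time scale is what distinguishes the three situations discussed above.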
Abstract:
The PHENIX experiment has measured the suppression of semi-inclusive single high-transverse-momentum π⁰'s in Au+Au collisions at √(s_NN) = 200 GeV. The present understanding of this suppression is in terms of energy loss of the parent (fragmenting) parton in a dense color-charge medium. We have performed a quantitative comparison between various parton energy-loss models and our experimental data. The statistical, point-to-point uncorrelated, as well as correlated systematic uncertainties are taken into account in the comparison. We detail this methodology and the resulting constraints on the model parameters, such as the initial color-charge density dN^g/dy, the medium transport coefficient q̂, or the initial energy-loss parameter ε₀. We find that high-transverse-momentum π⁰ suppression in Au+Au collisions has sufficient precision to constrain these model-dependent parameters at the ±20-25% (one standard deviation) level. These constraints include only the experimental uncertainties, and further studies are needed to compute the corresponding theoretical uncertainties.
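As a hedged illustration of how correlated systematic uncertainties can enter such a model-data comparison (a generic profiled chi-square form, not necessarily the exact definition used by PHENIX), the data points may be shifted coherently by a fraction ε_b of their correlated systematic uncertainty and the shift penalized:

\[
\tilde{\chi}^2(\epsilon_b,\theta) = \sum_i \frac{\left(y_i + \epsilon_b\,\sigma_{b,i} - \mu_i(\theta)\right)^2}{\tilde{\sigma}_i^2} + \epsilon_b^2,
\]

where y_i are the measured points, μ_i(θ) the model prediction for parameters θ (for example dN^g/dy, q̂ or ε₀), σ_b,i the correlated systematic uncertainties and σ̃_i the point-to-point uncorrelated ones; the constraint on θ follows from minimizing over ε_b as well as over θ.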
Abstract:
Efficient automatic protein classification is of central importance in genomic annotation. As an independent way to check the reliability of the classification, we propose a statistical approach to test whether two sets of protein domain sequences coming from two families of the Pfam database are significantly different. We model protein sequences as realizations of Variable Length Markov Chains (VLMC) and we use the context trees as a signature of each protein family. Our approach is based on a Kolmogorov-Smirnov-type goodness-of-fit test proposed by Balding et al. [Limit theorems for sequences of random trees (2008), DOI: 10.1007/s11749-008-0092-z]. The test statistic is a supremum, over the space of trees, of a function of the two samples; its computation grows, in principle, exponentially fast with the maximal number of nodes of the potential trees. We show how to transform this problem into a max-flow problem over a related graph, which can be solved using a Ford-Fulkerson algorithm in time polynomial in that number. We apply the test to 10 randomly chosen protein domain families from the seed of the Pfam-A database (high-quality, manually curated families). The test shows that the distributions of context trees coming from different families are significantly different. We emphasize that this is a novel mathematical approach to validate the automatic clustering of sequences in any context. We also study the performance of the test via simulations on Galton-Watson related processes.
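As a generic illustration of the max-flow ingredient mentioned above (not the authors' specific graph construction), the flow value can be computed with a Ford-Fulkerson style augmenting-path search; the toy network below is hypothetical:

from collections import defaultdict, deque

def max_flow(capacity, source, sink):
    """Ford-Fulkerson with BFS (Edmonds-Karp): repeatedly find an
    augmenting path in the residual graph and push flow along it."""
    residual = defaultdict(lambda: defaultdict(int))
    for u in capacity:
        for v, c in capacity[u].items():
            residual[u][v] += c
    flow = 0
    while True:
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:   # BFS for an augmenting path
            u = queue.popleft()
            for v, c in residual[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:                # no augmenting path left
            return flow
        path, v = [], sink                    # recover the path and its bottleneck
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:                     # update forward and reverse residuals
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

graph = {"s": {"a": 3, "b": 2}, "a": {"t": 2, "b": 1}, "b": {"t": 3}}
print(max_flow(graph, "s", "t"))  # maximum flow of this toy network: 5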
Abstract:
Today several different unsupervised classification algorithms are commonly used to cluster similar patterns in a data set based only on its statistical properties. Especially in image data applications, self-organizing methods for unsupervised classification have been successfully applied for clustering pixels or groups of pixels in order to perform segmentation tasks. The first important contribution of this paper is the development of a self-organizing method for data classification, named Enhanced Independent Component Analysis Mixture Model (EICAMM), built by proposing some modifications to the Independent Component Analysis Mixture Model (ICAMM). These modifications address some of the original model's limitations and aim to make it more efficient. Moreover, a pre-processing methodology is also proposed, based on combining Sparse Code Shrinkage (SCS) image denoising with the Sobel edge detector. In the experiments of this work, EICAMM and other self-organizing models were applied to segment images in their original and pre-processed versions. A comparative analysis showed satisfactory and competitive image segmentation results for the proposals presented herein.
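A minimal sketch of the kind of pre-processing pipeline described above, with a simple Gaussian smoothing standing in for Sparse Code Shrinkage (the SCS step itself is not reproduced here) followed by a Sobel gradient magnitude; scipy is assumed to be available:

import numpy as np
from scipy import ndimage

def preprocess(image):
    """Denoise (Gaussian surrogate for SCS) and compute a Sobel edge map."""
    denoised = ndimage.gaussian_filter(image.astype(float), sigma=1.0)
    gx = ndimage.sobel(denoised, axis=0)
    gy = ndimage.sobel(denoised, axis=1)
    edges = np.hypot(gx, gy)                  # gradient magnitude
    return edges / edges.max() if edges.max() > 0 else edges

img = np.random.rand(64, 64)                  # hypothetical test image
print(preprocess(img).shape)                  # (64, 64)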
Abstract:
This article intends to contribute to the reflection on Educational Statistics as a source for research on the History of Education. The main concern was to reveal how the Educational Statistics covering the period from 1871 to 1931 were produced by the central government. Official reports from the General Statistics Directory and statistical yearbooks released by that department were analyzed, and in this analysis the recommendations and definitions that guided the work were sought. By problematizing the documentary issues surrounding Educational Statistics and their usual interpretations, the intention was to reduce the ignorance about the origin of school numbers, which are occasionally used in current research without proper critical examination.
Abstract:
This paper studies the growth, distribution and classification of pits in 310S austenitic stainless steel in the as-received and heat-treated states, under different exposure times in a saline environment. The applicability of this work is based on the development of a technique for the morphological characterization of localized corrosion, describing shape, size and population-specific parameters. The methodology consisted of the following steps: specimen preparation, salt-spray corrosion tests under different conditions, microstructural analysis, pit profile analysis, and digital image processing and analysis to characterize pit distribution, morphology and size. The results of the digital processing and profile image analysis were subjected to statistical analysis using the median as the parameter, for both the as-received and the treated alloy. The as-received alloy displays the following morphology ranking: hemispherical pits > transition region A > transition region B > irregular > conical. The pit counts in the treated alloy at each exposure time rank as: transition region B > hemispherical > transition region A > conical > irregular.
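As a hedged sketch of how a pit population can be summarized with a median-based statistic from a segmented image (a generic connected-component measurement, not the authors' exact protocol; the threshold and test image are hypothetical):

import numpy as np
from scipy import ndimage

def pit_statistics(gray_image, threshold=0.5):
    """Label pit regions in a thresholded image and report count and median area."""
    binary = gray_image < threshold            # pits assumed darker than the matrix
    labels, n_pits = ndimage.label(binary)
    if n_pits == 0:
        return 0, 0.0
    areas = ndimage.sum(binary, labels, index=range(1, n_pits + 1))
    return n_pits, float(np.median(areas))

img = np.random.rand(128, 128)                 # hypothetical micrograph
count, median_area = pit_statistics(img)
print(count, median_area)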
Abstract:
This study presents the results for a mature landfill leachate treated by a homogeneous catalytic ozonation process with Fe²⁺ and Fe³⁺ ions at acidic pH. Quality assessments were performed using Taguchi's method (L8 design). Strong synergism was observed statistically between molecular ozone and ferric ions, pointing to their catalytic effect on •OH generation. Achieving better organic matter depollution rates requires an ozone flow of 5 L h⁻¹ (590 mg h⁻¹ O₃) and a ferric ion concentration of 5 mg L⁻¹.
Abstract:
This work presents a thermoeconomic optimization methodology for the analysis and design of energy systems. The methodology combines economic aspects with the exergy concept in order to develop a tool to assist in equipment selection and operating-mode choice, as well as to optimize thermal plant design. It also presents the concepts of exergy in a general scope and of thermoeconomics, which combines the principles of the thermal sciences (thermodynamics, heat transfer and fluid mechanics) with engineering economics in order to rationalize the investment decisions, development and operation of energy systems. The paper then develops a thermoeconomic methodology based on a simple mathematical model, involving thermodynamic parameters and cost evaluation, and defines the objective function as the exergetic production cost. The optimization problem is evaluated for two energy systems: it is first applied to a vapor-compression refrigeration system and then to a cogeneration system using a backpressure steam turbine.
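For reference, a generic form of the cost balance underlying an exergetic production cost objective in thermoeconomics (a standard textbook-style formulation, not necessarily the exact model of the paper):

\[
c_P\,\dot{E}_P = c_F\,\dot{E}_F + \dot{Z}
\quad\Longrightarrow\quad
c_P = \frac{c_F\,\dot{E}_F + \dot{Z}}{\dot{E}_P},
\]

where Ė_F and Ė_P are the exergy rates of fuel and product, c_F and c_P the corresponding unit exergetic costs, and Ż the capital and operating cost rate of the equipment; minimizing c_P over the design variables is one way of posing the optimization described above.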
Abstract:
BACKGROUND: The combined effects of vanillin and syringaldehyde on xylitol production by Candida guilliermondii were studied using response surface methodology (RSM). A 2² full-factorial central composite design was employed for the experimental design and analysis of the results. RESULTS: Maximum xylitol productivity (Q_p = 0.74 g L⁻¹ h⁻¹) and yield (Y_P/S = 0.81 g g⁻¹) can be attained by adding only vanillin, at 2.0 g L⁻¹, to the fermentation medium. These predictions were closely matched by the experimental results (0.69 ± 0.04 g L⁻¹ h⁻¹ and 0.77 ± 0.01 g g⁻¹), indicating good agreement with the predicted values. C. guilliermondii was able to convert vanillin completely after 24 h of fermentation, with a 94% yield of vanillyl alcohol. CONCLUSIONS: The bioconversion of xylose into xylitol by C. guilliermondii is strongly dependent on the combination of aldehydes and phenolics in the fermentation medium. Vanillin is a phenolic compound able to improve xylitol production by the yeast. The conversion of vanillin to vanillyl alcohol reveals the potential of this yeast for medium detoxification.
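To illustrate the response surface step in generic terms (the design points and responses below are hypothetical, not the paper's data), a second-order model in two coded factors can be fitted by ordinary least squares:

import numpy as np

# hypothetical 2^2 central composite design in coded units (x1, x2)
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],                # factorial points
              [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414],  # axial points
              [0, 0], [0, 0], [0, 0]])                           # center replicates
y = np.array([0.52, 0.61, 0.58, 0.74, 0.50, 0.70, 0.55, 0.66, 0.68, 0.69, 0.67])

def quadratic_terms(x):
    x1, x2 = x[:, 0], x[:, 1]
    # intercept, linear, interaction and pure quadratic terms
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

beta, *_ = np.linalg.lstsq(quadratic_terms(X), y, rcond=None)
print(beta)   # fitted coefficients b0, b1, b2, b12, b11, b22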
Abstract:
Response surface methodology was used to evaluate the optimal time, temperature and oxalic acid concentration for simultaneous saccharification and fermentation (SSF) of corncob particles by Pichia stipitis CBS 6054. Fifteen different pretreatment conditions were examined in a 2³ full factorial design with six axial points. Temperatures ranged from 132 to 180 °C, time from 10 to 90 min, and oxalic acid loadings from 0.01 to 0.038 g/g solids. Separate maxima were found for enzymatic saccharification and hemicellulose fermentation, respectively, with the condition for maximum saccharification being significantly more severe. Ethanol production was affected by reaction temperature more than by oxalic acid loading and reaction time over the ranges examined. The effect of reaction temperature on ethanol production was significant at the 95% confidence level, while oxalic acid and reaction time were statistically significant at the 90% level. The highest ethanol concentration (20 g/L) was obtained after 48 h, with an ethanol volumetric production rate of 0.42 g ethanol L⁻¹ h⁻¹. The ethanol yield after SSF with P. stipitis was significantly higher than predicted from sequential saccharification and fermentation of substrate pretreated under the same condition. This was attributed to the secretion of beta-glucosidase by P. stipitis. During SSF, free extracellular beta-glucosidase activity was 1.30 pNPG U/g with P. stipitis, compared with 0.66 pNPG U/g during saccharification without the yeast.
Abstract:
A hybrid system to automatically detect, locate and classify disturbances affecting power quality in an electrical power system is presented in this paper. The disturbances characterized are events from an actual power distribution system simulated with the ATP (Alternative Transients Program) software. The hybrid approach consists of two stages. In the first stage, the wavelet transform (WT) is used to detect disturbances in the system and to locate the time of their occurrence. When such an event is flagged, the second stage is triggered and various artificial neural networks (ANNs) are applied to classify the data measured during the disturbance(s). A computational logic using WTs and ANNs, together with a graphical user interface (GUI) between the algorithm and its end user, is then implemented. The results obtained so far are promising and suggest that this approach could lead to a useful application in an actual distribution system.
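A minimal sketch of the detection stage described above, using the PyWavelets package to flag samples whose level-1 detail coefficients stand out from a robust noise estimate (the signal, wavelet choice and threshold are assumptions for illustration, not the authors' configuration):

import numpy as np
import pywt

def detect_disturbance(signal, wavelet="db4", k=5.0):
    """Return sample indices where level-1 wavelet detail coefficients
    exceed k times a median-based noise estimate."""
    _, detail = pywt.dwt(signal, wavelet)
    noise = np.median(np.abs(detail)) / 0.6745       # robust sigma estimate
    hits = np.where(np.abs(detail) > k * noise)[0]
    return hits * 2                                  # details are at half the original rate

t = np.arange(0, 0.2, 1 / 7680)                      # hypothetical 60 Hz waveform
v = np.sin(2 * np.pi * 60 * t)
v[700:710] += 0.8 * np.random.randn(10)              # injected transient
print(detect_disturbance(v))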
Abstract:
Despite modern weed control practices, weeds continue to be a threat to agricultural production. Considering the variability of weeds, a classification methodology for the risk of infestation in agricultural zones using fuzzy logic is proposed. The inputs for the classification are attributes extracted from maps of weed seed production and weed coverage estimated by kriging and map analysis, together with the percentage of the surface infested by grass weeds, in order to account for the presence of weed species with a high rate of development and proliferation. The output of the classification predicts the risk of infestation of regions of the field for the next crop. The risk classification methodology described in this paper integrates analysis techniques which may help to reduce costs and improve weed control practices. Results for the risk classification of the infestation in a maize crop field are presented. To illustrate the effectiveness of the proposed system, the risk of infestation over the entire field is checked against the yield loss map estimated by kriging and also against the average yield loss estimated from a hyperbolic model.
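As a generic, hedged sketch of a fuzzy risk classifier of the kind described (the membership functions, rule base and crisp output values below are illustrative placeholders, not the authors' system):

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return float(np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def infestation_risk(seed_production, weed_coverage, grass_fraction):
    """Two toy Mamdani-style rules on inputs normalized to [0, 1];
    returns a crisp risk in [0, 1] by height defuzzification."""
    high = lambda x: tri(x, 0.5, 1.0, 1.5)
    low = lambda x: tri(x, -0.5, 0.0, 0.5)
    # rule 1: (high seed production AND high coverage) OR high grass share -> high risk
    high_risk = max(min(high(seed_production), high(weed_coverage)), high(grass_fraction))
    # rule 2: low seed production AND low coverage -> low risk
    low_risk = min(low(seed_production), low(weed_coverage))
    return (1.0 * high_risk + 0.0 * low_risk) / max(high_risk + low_risk, 1e-9)

print(infestation_risk(0.8, 0.7, 0.2))   # hypothetical region attributes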
Abstract:
This work presents a statistical study of the variability of the mechanical properties of hardened self-compacting concrete, including the compressive strength, splitting tensile strength and modulus of elasticity. The comparison of the experimental results with those derived from several codes and recommendations allows evaluating whether the hardened behaviour of self-compacting concrete can be appropriately predicted by the existing formulations. The variables analyzed include the maximum aggregate size and the paste and gravel contents. The self-compacting concretes analyzed presented variability measures in the same range as expected for conventional vibrated concrete, with all the results within a 95% confidence level. From the several formulations for conventional concrete considered in this study, it was observed that a safe estimate of the modulus of elasticity can be obtained from the value of the compressive strength, with lower-strength self-compacting concretes presenting higher safety margins. However, most codes overestimate the material tensile strength.
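As one example of the kind of code formulation referred to above (Eurocode 2-type expressions, given here purely as an illustration; the paper compares several codes), the modulus of elasticity and mean tensile strength are estimated from the compressive strength as

\[
E_{cm} = 22\left(\frac{f_{cm}}{10}\right)^{0.3}\ \text{GPa},
\qquad
f_{ctm} = 0.30\, f_{ck}^{2/3}\ \text{MPa} \quad (f_{ck}\le 50\ \text{MPa}),
\]

with f_cm and f_ck in MPa; checking predictions of this type against the measured values is what leads to the conclusions about safe modulus estimates and overestimated tensile strength.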
Abstract:
The importance of a careful selection of the rocks used in building facade cladding is highlighted. A simple and viable methodology for the structural detailing of dimension stones and the verification of their global performance is presented, based on a Strap software simulation. The results obtained proved the applicability of the proposed structural dimensioning methodology, which represents an excellent and simple tool for dimensioning rock slabs used for building facade cladding. The Strap software satisfactorily simulated the structural conditions of the stone slabs under the studied conditions, allowing the determination of alternative slab dimensions and the verification of the cladding strength at the supports.
Abstract:
In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In the face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the "absolute" and "relative" safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689 [14], using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from the simulation of both the proposed (under analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the Automatic Dependent Surveillance-Broadcast (ADS-B) based air traffic control system. In conclusion, the proposed methodology was shown to be suitable for assessing CNS/ATM system safety properties, with the FSPN formalism providing important modeling capabilities and discrete event simulation allowing the estimation of the desired safety metrics.
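A heavily simplified, hypothetical sketch of the comparative idea (estimate the same safety metric by simulation for a legacy and a proposed configuration); the two-state model, the rates and the interpretation are illustrative only and bear no relation to the actual FSPN models of the paper:

import random

def unsafe_fraction(failure_rate, repair_rate, horizon=1e6, seed=0):
    """Discrete-event simulation of a two-state (safe/unsafe) system with
    exponential dwell times; returns the fraction of time spent unsafe."""
    rng = random.Random(seed)
    t, unsafe_time, state = 0.0, 0.0, "safe"
    while t < horizon:
        rate = failure_rate if state == "safe" else repair_rate
        dwell = min(rng.expovariate(rate), horizon - t)
        if state == "unsafe":
            unsafe_time += dwell
        t += dwell
        state = "unsafe" if state == "safe" else "safe"
    return unsafe_time / horizon

# hypothetical transition rates for the legacy and the proposed configuration
legacy = unsafe_fraction(failure_rate=1e-3, repair_rate=0.5)
proposed = unsafe_fraction(failure_rate=4e-4, repair_rate=0.5)
print(legacy, proposed)   # the relative comparison is the quantity of interest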