40 results for Statistical Error


Relevance: 20.00%

Abstract:

We propose a statistical model to account for the gel-fluid anomalous phase transitions in charged bilayer- or lamellae-forming ionic lipids. The model Hamiltonian comprises effective attractive interactions to describe neutral-lipid membranes as well as the effect of electrostatic repulsions of the discrete ionic charges on the lipid headgroups. The latter can be counterion dissociated (charged) or counterion associated (neutral), while the lipid acyl chains may be in gel (low-temperature or high-lateral-pressure) or fluid (high-temperature or low-lateral-pressure) states. The system is modeled as a lattice gas with two distinct particle types, associated respectively with the polar-headgroup and the acyl-chain states, which can be mapped onto an Ashkin-Teller model with the inclusion of cubic terms. The model displays a rich thermodynamic behavior in terms of the chemical potential of counterions (related to the added salt concentration) and the lateral pressure. In particular, we show the existence of semidissociated thermodynamic phases related to the onset of charge order in the system. This type of order stems from spatially ordered counterion association to the lipid headgroups, in which charged and neutral lipids alternate in a checkerboard-like pattern. Within the mean-field approximation, we predict that the acyl-chain order-disorder transition is discontinuous, with the first-order line ending at a critical point, as in the neutral case. Moreover, the charge order gives rise to continuous transitions, with the associated second-order lines joining the aforementioned first-order line at critical end points. We explore the thermodynamic behavior of some physical quantities, like the specific heat at constant lateral pressure and the degree of ionization, associated with the fraction of charged lipid headgroups.
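
For orientation only: an Ashkin-Teller-type Hamiltonian with cubic terms can be written as below, with σ_i = ±1 encoding the acyl-chain state (gel/fluid) and τ_i = ±1 the headgroup state (charged/neutral). The couplings J, K, L, C and the fields are illustrative placeholders; the paper's actual lattice-gas parameters are not given in the abstract.

```latex
% Illustrative Ashkin-Teller form with cubic couplings; all symbols are
% placeholders, not the paper's parameters. The fields would play the
% roles of the lateral pressure and the counterion chemical potential.
\[
  \mathcal{H} = -\sum_{\langle ij\rangle}
    \Bigl[ J\,\sigma_i\sigma_j + K\,\tau_i\tau_j
         + L\,\sigma_i\sigma_j\tau_i\tau_j
         + C\,\bigl(\sigma_i\tau_i\tau_j + \sigma_j\tau_j\tau_i\bigr) \Bigr]
    - H_\sigma \sum_i \sigma_i - H_\tau \sum_i \tau_i .
\]
```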

Relevance: 20.00%

Abstract:

We consider a simple Maier-Saupe statistical model with the inclusion of disorder degrees of freedom to mimic the phase diagram of a mixture of rodlike and disklike molecules. A quenched distribution of shapes leads to a phase diagram with two uniaxial phases and a biaxial nematic structure. A thermalized distribution, however, which is more adequate for liquid mixtures, precludes the stability of this biaxial phase. We then use a two-temperature formalism and assume a separation of relaxation times to show that a partial degree of annealing is already sufficient to stabilize a biaxial nematic structure.
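
As a reference point, a Maier-Saupe-type energy for a rod-disk mixture is often written with a shape variable λ_i = ±1 multiplying the quadrupolar coupling, as sketched below; the form and the amplitude A are generic, not taken from the paper. Averaging over {λ_i} with the shapes frozen gives the quenched case; letting them equilibrate gives the thermalized (annealed) case.

```latex
% Generic Maier-Saupe energy with shape disorder (illustrative):
% S_i is the traceless quadrupole (nematic) tensor of molecule i and
% lambda_i = +1 (rod) or -1 (disk); A sets the interaction strength.
\[
  \mathcal{H} = -\frac{A}{N} \sum_{i<j}
    \lambda_i \lambda_j \; \mathbf{S}_i : \mathbf{S}_j .
\]
```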

Relevance: 20.00%

Abstract:

The parallel mutation-selection evolutionary dynamics, in which mutation and replication are independent events, is solved exactly in the case that the Malthusian fitnesses associated with the genomes are described by the random energy model (REM) and by a ferromagnetic version of the REM. The solution method uses the mapping of the evolutionary dynamics onto a quantum Ising chain in a transverse field and the Suzuki-Trotter formalism to calculate the transition probabilities between configurations at different times. We find that in the case of the REM landscape the dynamics can exhibit three distinct regimes: pure diffusion or stasis for short times, depending on the fitness of the initial configuration, and a spin-glass regime for long times. The transitions between these dynamical regimes are marked by discontinuities in the mean fitness as well as in the overlap with the initial reference sequence. The relaxation to equilibrium is described by an inverse-time decay. In the ferromagnetic REM we find, in addition to these three regimes, a ferromagnetic regime where the overlap and the mean fitness are frozen. In this case, the system relaxes to equilibrium in a finite time. The relevance of our results to information-processing aspects of evolution is discussed.
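
For context, the parallel (as opposed to coupled) mutation-selection dynamics is commonly written as the growth-plus-mutation equation below; the per-site mutation rate μ and the notation are generic textbook choices, not values from the paper.

```latex
% Generic parallel mutation-selection equation (illustrative notation):
% p_sigma(t) is the frequency of genome sigma, W_sigma its Malthusian
% fitness (here drawn from the REM), sigma^(i) is sigma with site i flipped.
\[
  \frac{dp_\sigma}{dt}
    = \bigl[\,W_\sigma - \bar{W}(t)\,\bigr]\,p_\sigma
    + \mu \sum_{i=1}^{N} \bigl[\,p_{\sigma^{(i)}} - p_\sigma\,\bigr],
  \qquad
  \bar{W}(t) = \sum_\sigma W_\sigma\, p_\sigma(t).
\]
```

Removing the nonlinear mean-fitness term by an exponential change of variables leaves the linear generator \(\hat{W} + \mu \sum_i (\hat{\sigma}_i^x - 1)\), i.e., fitness on the diagonal plus single-site flips as a transverse field, which is the quantum Ising mapping the abstract refers to.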

Relevance: 20.00%

Abstract:

Efficient automatic protein classification is of central importance in genomic annotation. As an independent way to check the reliability of the classification, we propose a statistical approach to test whether two sets of protein domain sequences coming from two families of the Pfam database are significantly different. We model protein sequences as realizations of Variable Length Markov Chains (VLMC) and use the context trees as a signature of each protein family. Our approach is based on a Kolmogorov-Smirnov-type goodness-of-fit test proposed by Balding et al. [Limit theorems for sequences of random trees (2008), DOI: 10.1007/s11749-008-0092-z]. The test statistic is a supremum, over the space of trees, of a function of the two samples; its computation grows, in principle, exponentially fast with the maximal number of nodes of the potential trees. We show how to transform this problem into a max-flow problem over a related graph, which can be solved using a Ford-Fulkerson algorithm in time polynomial in that number. We apply the test to 10 randomly chosen protein domain families from the seed of the Pfam-A database (high-quality, manually curated families). The test shows that the distributions of context trees coming from different families are significantly different. We emphasize that this is a novel mathematical approach to validate the automatic clustering of sequences in any context. We also study the performance of the test via simulations on Galton-Watson-related processes.
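
As a minimal sketch of the computational step named above (solving a max-flow instance with a Ford-Fulkerson-type routine), assuming networkx; the actual graph construction from the two context-tree samples is described in the paper, not in this abstract, so the network and capacities below are purely hypothetical.

```python
# Illustrative only: a max-flow instance solved with Edmonds-Karp,
# the BFS variant of Ford-Fulkerson, which runs in polynomial time.
import networkx as nx
from networkx.algorithms.flow import edmonds_karp

G = nx.DiGraph()
# Hypothetical capacities on a small network (source "s", sink "t").
G.add_edge("s", "a", capacity=3.0)
G.add_edge("s", "b", capacity=2.0)
G.add_edge("a", "t", capacity=2.5)
G.add_edge("b", "t", capacity=1.5)
G.add_edge("a", "b", capacity=1.0)

flow_value, flow_dict = nx.maximum_flow(G, "s", "t", flow_func=edmonds_karp)
print(flow_value)  # 4.0 for these toy capacities
```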

Relevance: 20.00%

Abstract:

Objective: We carry out a systematic assessment of a suite of kernel-based learning machines on the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of the wavelet basis on the quality of the features extracted; four wavelet basis functions were considered in this study. Then, we provide the average accuracy values, estimated via cross-validation, delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, whereby one can visually inspect their levels of sensitivity to the type of feature and to the kernel function/parameter value. Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. Moreover, the choice of the kernel function and parameter value, as well as the choice of the feature extractor, are critical decisions, although the choice of the wavelet family seems not to be so relevant. Also, the statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged among all types of machines, involving regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality). (C) 2011 Elsevier B.V. All rights reserved.
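
A minimal sketch of the kind of evaluation loop described (a radial-basis-kernel SVM swept over kernel radii and scored by cross-validation), assuming scikit-learn; the EEG data and the 21 feature extractors are not reproducible from the abstract, so the feature matrix below is a synthetic placeholder.

```python
# Hedged sketch: kernel-SVM sweep in the spirit of the study. The
# dataset, feature types, and the exact 26 kernel radii are the paper's;
# everything below is illustrative.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 21))      # placeholder feature matrix
y = rng.integers(0, 2, size=200)    # placeholder labels (normal/epileptic)

for gamma in np.logspace(-3, 2, 6):  # sweep over kernel radii
    clf = SVC(kernel="rbf", gamma=gamma, C=1.0)
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"gamma={gamma:g}  CV accuracy={acc:.3f}")
```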

Relevance: 20.00%

Abstract:

Here, I investigate the use of Bayesian updating rules applied to modeling how social agents change their minds in the case of continuous opinion models. Given another agent's statement about the continuous value of a variable, we will see that interesting dynamics emerge when an agent assigns to that value a likelihood that is a mixture of a Gaussian and a uniform distribution. This represents the idea that the other agent might have no idea what is being talked about. The effect of updating only the first moment of the distribution will be studied, and we will see that this generates results similar to those of the bounded confidence models. On also updating the second moment, several different opinions always survive in the long run, as agents become more stubborn with time. However, depending on the probability of error and the initial uncertainty, those opinions might be clustered around a central value.
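
A minimal sketch of one such update, assuming for simplicity that the Gaussian component carries the agent's own variance and that the uniform component has unit density (e.g., opinions on [0, 1]); the constants are illustrative, not the paper's.

```python
# Hedged sketch of a Gaussian + uniform mixture opinion update.
import numpy as np
from scipy.stats import norm

def update(mu, var, x, p_right=0.8, unif_density=1.0):
    """Posterior mean/variance after hearing another agent's statement x.

    Likelihood of x given truth theta: p_right * N(x; theta, var)
    + (1 - p_right) * unif_density (the "no idea" component).
    """
    sd = np.sqrt(2.0 * var)              # predictive sd under the Gaussian part
    like_gauss = norm.pdf(x, loc=mu, scale=sd)
    # posterior weight of the Gaussian component
    w = p_right * like_gauss / (p_right * like_gauss
                                + (1 - p_right) * unif_density)
    # conjugate update if the Gaussian component holds (equal variances)
    mu_g, var_g = (mu + x) / 2.0, var / 2.0
    # mixture moments: with prob w use the updated Gaussian, else the prior
    mu_new = w * mu_g + (1 - w) * mu
    var_new = (w * (var_g + mu_g**2)
               + (1 - w) * (var + mu**2)) - mu_new**2
    return mu_new, var_new

print(update(0.5, 0.04, 0.6))
```

Updating only mu_new reproduces bounded-confidence-like behavior; also shrinking var_new makes the agent progressively more stubborn, as described above.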

Relevance: 20.00%

Abstract:

This article intends to contribute to the discussion of educational statistics as a source for research on the history of education. The main concern was to reveal how the educational statistics covering the period from 1871 to 1931 were produced by the central government. Official reports from the General Statistics Directory and statistical yearbooks released by that department were analyzed, and in this analysis the recommendations and definitions guiding the work were sought. By problematizing the documentary issues surrounding educational statistics and their usual interpretations, the intention was to reduce the ignorance about the origin of school numbers, which are occasionally used in current research without the appropriate critical examination.

Relevance: 20.00%

Abstract:

This study presents the results of a mature landfill leachate treated by a homogeneous catalytic ozonation process with Fe²⁺ and Fe³⁺ ions at acidic pH. Quality assessments were performed using Taguchi's method (L8 design). Strong synergism was observed statistically between molecular ozone and ferric ions, pointing to their catalytic effect on •OH generation. Achieving better organic matter removal rates requires an ozone flow of 5 L h⁻¹ (590 mg h⁻¹ of O₃) and a ferric ion concentration of 5 mg L⁻¹.
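
The abstract names Taguchi's L8 design; the standard L8(2⁷) orthogonal array is reproduced below as a sketch. Which of the study's factors (e.g., ozone flow, ferric ion dose, pH) was assigned to which column is a detail of the paper and is not asserted here.

```python
# The standard Taguchi L8 (2^7) orthogonal array: 8 runs, up to 7
# two-level factors. Column-to-factor assignment is study-specific.
L8 = [
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 2, 2, 2, 2],
    [1, 2, 2, 1, 1, 2, 2],
    [1, 2, 2, 2, 2, 1, 1],
    [2, 1, 2, 1, 2, 1, 2],
    [2, 1, 2, 2, 1, 2, 1],
    [2, 2, 1, 1, 2, 2, 1],
    [2, 2, 1, 2, 1, 1, 2],
]
for run in L8:
    print(run)
```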

Relevance: 20.00%

Abstract:

Medication administration errors (MAE) are the most frequent kind of medication error. Errors with antimicrobial drugs (AD) are relevant because they may interfere with patient safety and contribute to the development of microbial resistance. The aim of this study is to analyze the AD errors detected in a Brazilian multicentric study of MAE. It was a descriptive and exploratory study carried out in clinical units of five Brazilian teaching hospitals, each investigated for 30 days. MAE were detected by the observation technique and classified into categories: wrong route (WR), wrong patient (WP), wrong dose (WD), wrong time (WT) and unordered drug (UD). The AD involved in MAE were classified by the Anatomical Therapeutic Chemical (ATC) classification system, and AD with a narrow therapeutic index (NTI) were identified. A descriptive statistical analysis was performed using SPSS version 11.5 software. A total of 1500 errors were observed, 277 (18.5%) of them involving AD. The types of AD error were: WT 87.7%, WD 6.9%, WR 1.5%, UD 3.2% and WP 0.7%. The number of distinct AD found was 36. The most frequent ATC classes were fluoroquinolones (13.9%), penicillin combinations (13.9%), macrolides (8.3%) and third-generation cephalosporins (5.6%). The parenteral dosage form was associated with 55.6% of the AD errors, and 16.7% of the AD were NTI drugs; NTI drugs were involved in 47.4% of WD and 21.8% of WT errors. This study shows that these errors should be considered potential areas for improvement in the medication process and patient safety, and that there is a need to promote the rational use of AD.

Relevance: 20.00%

Abstract:

In this study, the innovation approach is used to estimate the total measurement error associated with power system state estimation. This is required because the power system equations are strongly correlated with each other and, as a consequence, part of the measurement errors is masked. For that purpose, an innovation index (II), which quantifies the amount of new information a measurement contains, is proposed. A critical measurement is the limiting case of a measurement with low II: it has a zero II, and its error is totally masked. In other words, that measurement does not bring any innovation to the gross error test. Using the II of a measurement, the gross error masked by the state estimation is recovered; then the total gross error of that measurement is composed. Instead of the classical normalised measurement residual amplitude, the corresponding normalised composed measurement residual amplitude is used in the gross error detection and identification test, but with m degrees of freedom. The gross error processing turns out to be very simple to implement, requiring only a few adaptations to the existing state estimation software. The IEEE 14-bus system is used to validate the proposed gross error detection and identification test.
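
The abstract does not reproduce the formulas. In the innovation-index literature on state estimation, the composed (recovered) measurement error is often expressed in a form like the one below; it is quoted only as an orientation sketch under that assumption, not as this paper's exact expression.

```latex
% Assumed form (orientation only): r_i is the residual of measurement i
% and II_i its innovation index; a low II_i inflates the composed error,
% and a critical measurement (II_i -> 0) has a fully masked error.
\[
  \mathrm{CME}_i = r_i \sqrt{\,1 + \frac{1}{II_i^{2}}\,}.
\]
```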

Relevance: 20.00%

Abstract:

With the relentless quest for improved performance driving ever tighter manufacturing tolerances, machine tools are sometimes unable to meet the desired requirements. One option for improving the tolerances of machine tools is to compensate for their errors. Among all possible sources of machine tool error, thermally induced errors are, in general, the most important for newer machines. The present work demonstrates the evaluation and modelling of the behaviour of the thermal errors of a CNC cylindrical grinding machine during its warm-up period.

Relevance: 20.00%

Abstract:

This work presents a statistical study of the variability of the mechanical properties of hardened self-compacting concrete, including the compressive strength, splitting tensile strength and modulus of elasticity. The comparison of the experimental results with those derived from several codes and recommendations allows evaluating whether the hardened behaviour of self-compacting concrete can be appropriately predicted by the existing formulations. The variables analyzed include the maximum aggregate size and the paste and gravel contents. The self-compacting concretes analyzed presented variability measures in the same range as those expected for conventional vibrated concrete, with all the results within a 95% confidence level. Among the several formulations for conventional concrete considered in this study, it was observed that a safe estimate of the modulus of elasticity can be obtained from the value of the compressive strength, with lower-strength self-compacting concretes presenting higher safety margins. However, most codes overestimate the material's tensile strength. (C) 2010 Elsevier Ltd. All rights reserved.
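
The abstract does not list which codes were compared; purely as an illustration of code-based estimation of the modulus of elasticity from compressive strength, two widely used formulas (ACI 318 and Eurocode 2) are sketched below.

```python
# Illustrative code formulas relating modulus of elasticity to
# compressive strength; shown as common examples, not the paper's set.
import math

def e_aci318(fc_mpa: float) -> float:
    """ACI 318 (normal-weight concrete): Ec = 4700*sqrt(f'c), in MPa."""
    return 4700.0 * math.sqrt(fc_mpa)

def e_ec2(fcm_mpa: float) -> float:
    """Eurocode 2: Ecm = 22*(fcm/10)**0.3 GPa, returned here in MPa."""
    return 22_000.0 * (fcm_mpa / 10.0) ** 0.3

for fck in (30.0, 50.0, 70.0):
    fcm = fck + 8.0                       # EC2 mean strength, fcm = fck + 8 MPa
    print(fck, round(e_aci318(fck)), round(e_ec2(fcm)))
```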

Relevance: 20.00%

Abstract:

We describe a one-time signature scheme based on the hardness of the syndrome decoding problem, and prove it secure in the random oracle model. Our proposal can be instantiated on general linear error-correcting codes, rather than restricted families like alternant codes for which a decoding trapdoor is known to exist. (C) 2010 Elsevier Inc. All rights reserved.
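
This is not the paper's scheme; as a sketch of the hard problem it relies on, the snippet below computes the syndrome of a low-weight error vector under a random binary parity-check matrix, with toy parameters chosen purely for illustration. Syndrome decoding is the task of inverting this map.

```python
# The central relation in code-based cryptography: the syndrome
# s = H @ e (mod 2) of a low-weight error vector e under a
# parity-check matrix H. Recovering a low-weight e from s is hard.
import numpy as np

rng = np.random.default_rng(1)
n, k, w = 16, 8, 2                            # toy parameters (illustrative)
H = rng.integers(0, 2, size=(n - k, n))       # random parity-check matrix
e = np.zeros(n, dtype=int)
e[rng.choice(n, size=w, replace=False)] = 1   # weight-w error vector

s = H @ e % 2                                 # the syndrome
print(s)
```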

Relevance: 20.00%

Abstract:

The purpose of this article is to present a quantitative analysis of the human failure contribution to collisions and/or groundings of oil tankers, considering the recommendations of the "Guidelines for Formal Safety Assessment" of the International Maritime Organization. Initially, the methodology employed is presented, emphasizing the use of a technique for human error prediction to reach the desired objective. This methodology is then applied to a ship operating on the Brazilian coast and, thereafter, the procedure to isolate the human actions with the greatest potential to reduce the risk of an accident is described. Finally, the management and organizational factors presented in the "International Safety Management Code" are associated with these selected actions. An operator will therefore be able to decide where to act in order to obtain an effective reduction in the probability of accidents. Even though this study does not present a new methodology, it can be considered a reference in human reliability analysis for the maritime industry, which, in spite of having some guides for risk analysis, has few studies related to human reliability effectively applied to the sector.

Relevance: 20.00%

Abstract:

The objective of this project was to evaluate the extraction of soybean-flour protein into dairy whey by a multivariate statistical method based on a 2³ factorial experiment. The influence of three variables was considered: temperature, pH and sodium chloride percentage, against the process-specific response variable (percentage of protein extracted). Following the protein extraction over time and temperature, it was observed that the treatments at 80 °C for 2 h presented the highest total protein values (5.99%), and the percentage of protein extracted increased with heating time. The maximum of the function representing the protein extraction was therefore analysed through the 2³ factorial experiment, and the results indicated that all the variables were important to the extraction. The statistical analysis showed that the studied ranges of pH, temperature and sodium chloride percentage were not sufficient for the extraction process to reach an inflection point of the fitted mathematical function; on the other hand, the model was significant as well as predictive.
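
A minimal sketch of the coded 2³ full-factorial layout for the three factors named above; the actual temperature, pH and NaCl levels used in the study are not asserted here.

```python
# Coded -1/+1 layout of a 2^3 full factorial design: 8 runs covering
# every combination of the three factors' low/high levels.
from itertools import product

factors = ("temperature", "pH", "NaCl_percent")
design = list(product((-1, +1), repeat=3))   # 8 coded runs

for run, levels in enumerate(design, start=1):
    print(run, dict(zip(factors, levels)))
```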