77 results for Bose-Einstein condensation statistical model


Relevance:

30.00%

Publisher:

Abstract:

We numerically study the dynamics of a discrete spring-block model introduced by Olami, Feder, and Christensen (OFC) to mimic earthquakes and investigate to what extent this simple model is able to reproduce the observed spatiotemporal clustering of seismicity. Following a recently proposed method to characterize such clustering by networks of recurrent events [J. Davidsen, P. Grassberger, and M. Paczuski, Geophys. Res. Lett. 33, L11304 (2006)], we find that for synthetic catalogs generated by the OFC model these networks have many nontrivial statistical properties. This includes characteristic degree distributions, very similar to what has been observed for real seismicity. There are, however, also significant differences between the OFC model and earthquake catalogs, indicating that this simple model is insufficient to account for certain aspects of the spatiotemporal clustering of seismicity.
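
The toppling rule of the OFC model is simple enough to illustrate directly. Below is a minimal sketch of the OFC cellular automaton (uniform drive up to the next failure, threshold toppling with a conservation parameter alpha, open boundaries); the lattice size, number of events and alpha value are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

L = 32        # lattice side (illustrative)
ALPHA = 0.2   # conservation parameter; < 0.25 is dissipative
F_TH = 1.0    # failure threshold

F = rng.uniform(0.0, F_TH, size=(L, L))

def avalanche(F):
    """Relax every site at or above threshold; return avalanche size."""
    size = 0
    while True:
        unstable = np.argwhere(F >= F_TH)
        if unstable.size == 0:
            return size
        for i, j in unstable:
            f = F[i, j]
            F[i, j] = 0.0
            # open boundaries: force leaving the lattice is dissipated
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < L and 0 <= nj < L:
                    F[ni, nj] += ALPHA * f
            size += 1

sizes = []
for event in range(10_000):
    F += F_TH - F.max()        # uniform drive up to the next failure
    sizes.append(avalanche(F))
```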

Relevance:

30.00%

Publisher:

Abstract:

Efficient automatic protein classification is of central importance in genomic annotation. As an independent way to check the reliability of the classification, we propose a statistical approach to test if two sets of protein domain sequences coming from two families of the Pfam database are significantly different. We model protein sequences as realizations of Variable Length Markov Chains (VLMC) and we use the context trees as a signature of each protein family. Our approach is based on a Kolmogorov-Smirnov-type goodness-of-fit test proposed by Balding et al. [Limit theorems for sequences of random trees (2008), DOI: 10.1007/s11749-008-0092-z]. The test statistic is a supremum over the space of trees of a function of the two samples; its computation grows, in principle, exponentially fast with the maximal number of nodes of the potential trees. We show how to transform this problem into a max-flow problem over a related graph, which can be solved using the Ford-Fulkerson algorithm in time polynomial in that number. We apply the test to 10 randomly chosen protein domain families from the seed of the Pfam-A database (high-quality, manually curated families). The test shows that the distributions of context trees coming from different families are significantly different. We emphasize that this is a novel mathematical approach to validate the automatic clustering of sequences in any context. We also study the performance of the test via simulations on Galton-Watson related processes.
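
The reduction of the test statistic to a max-flow computation is the key algorithmic step. The sketch below shows only the generic max-flow building block, a breadth-first (Edmonds-Karp) variant of the Ford-Fulkerson algorithm on a dict-of-dicts capacity graph; the construction of the graph from the two context-tree samples is specific to the paper and is not reproduced here.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp (BFS-based Ford-Fulkerson): returns the maximum
    flow value on a dict-of-dicts capacity graph."""
    # build residual graph, adding reverse edges of capacity 0
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # reconstruct the path, find the bottleneck, update residuals
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck
```

For example, max_flow({'s': {'a': 3, 'b': 2}, 'a': {'t': 2}, 'b': {'t': 3}}, 's', 't') returns 4.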

Relevance:

30.00%

Publisher:

Abstract:

We study the dynamics of the adoption of new products by agents with continuous opinions and discrete actions (CODA). The model is such that refusal to adopt a new idea or product is increasingly weighted by neighboring agents as evidence against the product. Under these rules, we study the distribution of adoption times and the final proportion of adopters in the population. We compare the case where initial adopters are clustered to the case where they are randomly scattered around the social network, and investigate small-world effects on the final proportion of adopters. The model predicts a fat-tailed distribution for late adopters, which is verified by empirical data. (C) 2009 Elsevier B.V. All rights reserved.
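
As a concrete reference point, here is a minimal sketch of the basic CODA update on a ring of agents: opinions live in log-odds space and each observation of a neighbour's action shifts the observer's log-odds by a fixed step. The step size, network and initial conditions are illustrative, and the paper's increasing weight on refusals is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 200       # agents on a ring (illustrative)
STEP = 1.0    # log-odds shift per observed action (illustrative)
nu = rng.normal(0.0, 0.5, size=N)   # log-odds that the product is good

def action(v):
    """Discrete choice: adopt (+1) if the log-odds favour the product."""
    return 1 if v > 0.0 else -1

for t in range(50 * N):
    i = rng.integers(N)                  # a random agent...
    j = (i + rng.choice((-1, 1))) % N    # ...observes a ring neighbour
    # observing adoption pushes towards adopting; observing refusal
    # pushes away (the symmetric version of the update)
    nu[i] += STEP * action(nu[j])

final_adopters = sum(action(v) == 1 for v in nu) / N
```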

Relevance:

30.00%

Publisher:

Abstract:

The Random Parameter model was proposed to explain the structure of the covariance matrix in problems where most, but not all, of the eigenvalues of the covariance matrix can be explained by Random Matrix Theory. In this article, we explore the scaling properties of the model, as observed in the multifractal structure of the simulated time series. We use the Wavelet Transform Modulus Maxima technique to obtain the dependence of the multifractal spectrum on the parameters of the model. The model shows a scaling structure compatible with the stylized facts for a reasonable choice of the parameter values. (C) 2009 Elsevier B.V. All rights reserved.
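
As an indication of what the WTMM analysis involves, the sketch below computes a continuous wavelet transform of a toy series by direct convolution with a Ricker (Mexican hat) wavelet and locates the modulus maxima at each scale; these maxima lines are the input of the full WTMM machinery. The partition function and Legendre transform that yield the multifractal spectrum, and the Random Parameter model itself, are not reproduced here; the series and scales are illustrative.

```python
import numpy as np

def ricker(points, width):
    """Mexican-hat (Ricker) wavelet sampled on `points` points."""
    t = np.arange(points) - (points - 1) / 2.0
    x = t / width
    return (1.0 - x**2) * np.exp(-x**2 / 2.0)

def wtmm_maxima(signal, widths):
    """CWT by direct convolution, plus the positions of the local
    modulus maxima at each scale (the WTMM skeleton)."""
    maxima = {}
    for w in widths:
        kernel = ricker(min(10 * int(w), len(signal)), w)
        coeffs = np.convolve(signal, kernel, mode="same")
        mod = np.abs(coeffs)
        # interior points strictly larger than both neighbours
        idx = np.where((mod[1:-1] > mod[:-2]) & (mod[1:-1] > mod[2:]))[0] + 1
        maxima[w] = idx
    return maxima

# toy input: a random-walk series stands in for simulated model output
rng = np.random.default_rng(2)
series = np.cumsum(rng.standard_normal(4096))
lines = wtmm_maxima(series, widths=(2, 4, 8, 16, 32))
```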

Relevance:

30.00%

Publisher:

Abstract:

In this work we study an agent-based model to investigate the role of the degree of information asymmetry in market evolution. The model is quite simple and may be treated analytically, since consumers evaluate the quality of a certain good taking into account only the quality of the last good purchased plus their perceptive capacity beta. As a consequence, the system evolves according to a stationary Markov chain. The value of a good offered by the firms increases with quality according to an exponent alpha, which is a measure of the technology. It incorporates all the technological capacity of the production systems, such as education, scientific development and techniques that change the productivity rates. The technological level plays an important role in explaining how the asymmetry of information may affect market evolution in this model. We observe that, for high technological levels, the market can detect adverse selection. The model allows us to compute the maximum degree of information asymmetry before the market collapses. Beyond this critical point the market evolves for only a limited period of time and then dies out completely. When beta is close to 1 (symmetric information), the market becomes more profitable for high-quality goods, although high- and low-quality markets coexist. The maximum asymmetric information level is a consequence of an ergodicity breakdown in the process of quality evaluation. (C) 2011 Elsevier B.V. All rights reserved.
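
A heavily simplified sketch of the kind of dynamics described: the consumer's quality estimate is updated from the last purchase alone, weighted by the perceptive capacity beta, which makes the estimate a Markov chain, while the value of the good grows with quality through the technology exponent alpha. The specific update rule and parameter values below are illustrative stand-ins, not the paper's equations.

```python
import numpy as np

rng = np.random.default_rng(3)

ALPHA = 1.5   # technology exponent: value grows as quality**ALPHA
BETA = 0.8    # perceptive capacity; BETA = 1 means symmetric information

q_est = 0.5   # consumer's running quality estimate
history = []
for t in range(10_000):
    q_true = rng.uniform()     # quality of the good actually offered
    # the estimate depends only on the last purchase and on BETA,
    # so q_est evolves as a stationary Markov chain
    q_est = BETA * q_true + (1.0 - BETA) * q_est
    history.append((q_est, q_true ** ALPHA))   # (perception, value)
```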

Relevance:

30.00%

Publisher:

Abstract:

The literature presents a huge number of different simulations of gas-solid flows in risers applying two-fluid modeling. In spite of that, the related quantitative accuracy issue remains mostly untouched. This state of affairs seems to be mainly a consequence of modeling shortcomings, notably the lack of realistic closures. In this article, predictions from a two-fluid model are compared to other published two-fluid model predictions applying the same closures, and to experimental data. A particular matter of concern is whether or not the predictions are generated inside the statistical steady-state regime that characterizes riser flows. The present simulation was performed inside the statistical steady-state regime. Time-averaged results are presented for time-averaging intervals of 5, 10, 15 and 20 s inside the statistical steady-state regime. The independence of the averaged results with respect to the time-averaging interval is addressed, and the results averaged over the intervals of 10 and 20 s are compared to both experiment and other two-fluid predictions. It is concluded that the two-fluid model used is still very crude and cannot provide quantitatively accurate results, at least for the particular case considered. (C) 2009 Elsevier Inc. All rights reserved.
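
The independence check described in the abstract amounts to time-averaging the monitored quantities over windows of increasing length and verifying that the averages stop changing. A minimal sketch of that check on a toy signal, with the 5-20 s intervals from the abstract and an assumed sampling step:

```python
import numpy as np

def window_averages(series, dt, intervals=(5.0, 10.0, 15.0, 20.0)):
    """Time averages of a sampled signal over trailing windows of
    5-20 s, to check that the average no longer depends on the
    averaging interval (statistical steady state)."""
    out = {}
    for T in intervals:
        n = int(round(T / dt))
        out[T] = series[-n:].mean()
    return out

# toy stand-in for a monitored riser quantity sampled at 1 kHz
rng = np.random.default_rng(4)
dt = 1e-3
signal = 1.0 + 0.1 * rng.standard_normal(int(25.0 / dt))
print(window_averages(signal, dt))
```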

Relevance:

30.00%

Publisher:

Abstract:

This work extends a previously presented refined sandwich beam finite element (FE) model to vibration analysis, including dynamic piezoelectric actuation and sensing. The mechanical model is a refinement of the classical sandwich theory (CST), for which the core is modelled with a third-order shear deformation theory (TSDT). The FE model is developed considering, through the beam length, electrically: constant voltage for the piezoelectric layers and a quadratic third-order variable of the electric potential in the core; and mechanically: linear axial displacement, quadratic bending rotation of the core and cubic transverse displacement of the sandwich beam. Despite the refinement of the mechanical and electric behaviours of the piezoelectric core, the model leads to the same number of degrees of freedom as the previous CST one, due to a two-step static condensation of the internal dofs (bending rotation and core electric potential third-order variable). The results obtained with the proposed FE model are compared to available numerical, analytical and experimental ones. The results confirm that the TSDT and the induced cubic electric potential yield an extra stiffness to the sandwich beam. (C) 2007 Elsevier Ltd. All rights reserved.
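
Static condensation, the step that keeps the dof count unchanged, is a standard linear-algebra operation: internal dofs carrying no applied loads are eliminated from the element stiffness matrix via K_c = K_kk - K_kd K_dd^{-1} K_dk. A minimal sketch on a toy matrix (the 4x4 values are arbitrary, not the element's actual stiffness):

```python
import numpy as np

def condense(K, keep, drop):
    """Static condensation of the stiffness matrix K: eliminate the
    internal dofs `drop`, assuming no loads act on them, giving
    K_c = K_kk - K_kd K_dd^{-1} K_dk."""
    K_kk = K[np.ix_(keep, keep)]
    K_kd = K[np.ix_(keep, drop)]
    K_dd = K[np.ix_(drop, drop)]
    K_dk = K[np.ix_(drop, keep)]
    return K_kk - K_kd @ np.linalg.solve(K_dd, K_dk)

# toy 4-dof element: condense out the last two (internal) dofs
K = np.array([[ 4., -1.,  0., -1.],
              [-1.,  4., -1.,  0.],
              [ 0., -1.,  4., -1.],
              [-1.,  0., -1.,  4.]])
K_c = condense(K, keep=[0, 1], drop=[2, 3])
```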

Relevance:

30.00%

Publisher:

Abstract:

The main objective of this project was to evaluate the protein extraction of soybean flour in dairy whey by a multivariate statistical method with a 2^3 factorial experiment. The influence of three variables was considered: temperature, pH and percentage of sodium chloride, against the process response variable (percentage of protein extraction). It was observed that, when following protein extraction over time and temperature, the treatments at 80 degrees C for 2 h presented the greatest values of total protein (5.99%). The percentage of protein extraction increased with heating time. Therefore, the maximum point of the function that represents the protein extraction was analysed by the 2^3 factorial experiment. The results showed that all the variables were important to the extraction. The statistical analysis showed that the parameters pH, temperature and percentage of sodium chloride were not sufficient to characterize the extraction process, since it was not possible to obtain the inflection point of the mathematical function; on the other hand, the mathematical model was significant, as well as predictive.
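
For reference, estimating effects from a full 2^3 factorial design reduces to a least-squares fit on coded +/-1 factor levels. The sketch below uses hypothetical response values (only the 5.99% figure echoes the abstract) to show the mechanics:

```python
import numpy as np
from itertools import product

# coded factor levels for a full 2^3 design: temperature, pH, NaCl
runs = np.array(list(product((-1, 1), repeat=3)), dtype=float)
# hypothetical responses (% protein extracted) in standard run order
y = np.array([3.1, 3.4, 3.3, 3.9, 4.2, 4.8, 4.6, 5.99])

# design matrix: intercept, 3 main effects, 3 two-way interactions and
# the three-way interaction (a saturated factorial model)
A, B, C = runs.T
X = np.column_stack([np.ones(8), A, B, C, A*B, A*C, B*C, A*B*C])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
effects = 2.0 * coef[1:]   # classical factorial effects = 2 x coefficients
```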

Relevance:

30.00%

Publisher:

Abstract:

In a sample of censored survival times, the presence of an immune proportion of individuals who are not subject to death, failure or relapse may be indicated by a relatively high number of individuals with large censored survival times. In this paper the generalized log-gamma model is modified to allow for the possibility that long-term survivors are present in the data. The model attempts to estimate separately the effects of covariates on the surviving fraction, that is, the proportion of the population for which the event never occurs. The logistic function is used for the regression model of the surviving fraction. Inference for the model parameters is considered via maximum likelihood. Some influence methods, such as the local influence and the total local influence of an individual, are derived, analyzed and discussed. Finally, a data set from the medical area is analyzed under the generalized log-gamma mixture model. A residual analysis is performed in order to select an appropriate model.
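
The mixture (cure-rate) structure described here has population survival S_pop(t) = pi(x) + (1 - pi(x)) S(t), with a logistic model for the cured fraction pi. A minimal sketch of the maximum-likelihood fit, using scipy's generalized gamma as the susceptible-lifetime distribution (the generalized log-gamma of the paper is this distribution on the log-time scale) and a hypothetical censored sample:

```python
import numpy as np
from scipy.stats import gengamma
from scipy.optimize import minimize

def neg_loglik(theta, t, delta, x):
    """Mixture cure model: delta = 1 for observed events, 0 for
    censored. Events contribute (1 - pi) f(t); censored times
    contribute pi + (1 - pi) S(t), with logistic cured fraction pi."""
    b0, b1, log_a, log_c, log_scale = theta
    pi = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
    a, c, scale = np.exp(log_a), np.exp(log_c), np.exp(log_scale)
    f = gengamma.pdf(t, a, c, scale=scale)
    S = gengamma.sf(t, a, c, scale=scale)
    lik = np.where(delta == 1, (1.0 - pi) * f, pi + (1.0 - pi) * S)
    return -np.sum(np.log(np.clip(lik, 1e-300, None)))

# hypothetical right-censored sample with one binary covariate
rng = np.random.default_rng(5)
n = 300
x = rng.integers(0, 2, n).astype(float)
t = gengamma.rvs(2.0, 1.0, scale=1.0, size=n, random_state=rng)
delta = rng.integers(0, 2, n).astype(float)
fit = minimize(neg_loglik, x0=(0.0, 0.0, 0.5, 0.0, 0.0),
               args=(t, delta, x), method="Nelder-Mead")
```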

Relevance:

30.00%

Publisher:

Abstract:

Correct modeling of root water uptake partitioning over depth is an important issue in hydrological and crop growth models. Recently a physically based model to describe root water uptake was developed at single root scale and upscaled to the root system scale considering a homogeneous distribution of roots per soil layer. Root water uptake partitioning is calculated over soil layers or compartments as a function of respective soil hydraulic conditions, specifically the soil matric flux potential, root characteristics and a root system efficiency factor to compensate for within-layer root system heterogeneities. The performance of this model was tested in an experiment performed in two-compartment split-pot lysimeters with sorghum plants. The compartments were submitted to different irrigation cycles resulting in contrasting water contents over time. The root system efficiency factor was determined to be about 0.05. Release of water from roots to soil was predicted and observed on several occasions during the experiment; however, model predictions suggested root water release to occur more often and at a higher rate than observed. This may be due to not considering internal root system resistances, thus overestimating the ease with which roots can act as conductors of water. Excluding these erroneous predictions from the dataset, statistical indices show model performance to be of good quality.
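
A toy rendering of the partitioning idea, with the 0.05 efficiency factor taken from the abstract: each layer's uptake is proportional to its root fraction and to the difference between the layer's matric flux potential and the potential at the root surface, so a layer wetter than the root surface supplies water and a drier one can receive it (root water release). The functional form and numbers are illustrative stand-ins, not the published model.

```python
import numpy as np

def partition_uptake(M_layer, root_frac, M_root, efficiency=0.05):
    """Per-layer water flux (positive = uptake, negative = release
    from roots to soil), proportional to the root fraction and to the
    matric-flux-potential difference; `efficiency` compensates for
    within-layer root-system heterogeneity."""
    return efficiency * root_frac * (M_layer - M_root)

M = np.array([2.0e-6, 8.0e-6, 1.5e-5])   # matric flux potential per layer
roots = np.array([0.6, 0.3, 0.1])        # normalised root distribution
flux = partition_uptake(M, roots, M_root=5.0e-6)
# here the top (driest) layer gets a negative flux: root water release
```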

Relevance:

30.00%

Publisher:

Abstract:

We consider a kinetic Ising model which represents a generic agent-based model for various types of socio-economic systems. We study the case of a finite (and not necessarily large) number of agents N, as well as the asymptotic case when the number of agents tends to infinity. The main ingredient is a set of individual decision thresholds which are either fixed over time (corresponding to quenched disorder in the Ising model, leading to nonlinear deterministic dynamics which are generically non-ergodic) or which may change randomly over time (corresponding to annealed disorder, leading to ergodic dynamics). We address the question of how increasing the strength of annealed disorder relative to quenched disorder drives the system from non-ergodic behavior to ergodicity. A mathematically rigorous analysis provides an explicit and detailed picture for arbitrary realizations of the quenched initial thresholds, revealing an intriguing "jumpy" transition from non-ergodicity with many absorbing sets to ergodicity. For large N we find a critical strength of annealed randomness, above which the system becomes asymptotically ergodic. Our theoretical results suggest how to drive a system from an undesired socio-economic equilibrium (e.g., a high level of corruption) to a desirable one (a low level of corruption).
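
A minimal sketch of the quenched-versus-annealed mechanism: each agent has a fixed (quenched) threshold, and each decision optionally adds a fresh random shift whose strength LAMBDA plays the role of the annealed disorder; LAMBDA = 0 gives the deterministic, generically non-ergodic dynamics. The mean-field coupling and all values are illustrative, not the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(6)

N = 100          # number of agents
LAMBDA = 0.3     # strength of annealed randomness (illustrative)
T_STEPS = 5000

theta = rng.uniform(-1.0, 1.0, N)    # quenched individual thresholds
s = rng.choice((-1, 1), N)           # binary states (Ising spins)

for t in range(T_STEPS):
    i = rng.integers(N)
    field = s.mean()                 # mean-field coupling to all agents
    # annealed disorder: each decision adds a fresh random shift to
    # the quenched threshold; LAMBDA = 0 recovers determinism
    threshold = theta[i] + LAMBDA * rng.normal()
    s[i] = 1 if field > threshold else -1
```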

Relevance:

30.00%

Publisher:

Abstract:

Background: Different hemodynamic parameters, including static indicators of cardiac preload such as the right ventricular end-diastolic volume index (RVEDVI) and dynamic parameters such as pulse pressure variation (PPV), have been used in the decision-making process regarding volume expansion in critically ill patients. The objective of this study was to compare fluid resuscitation guided by either PPV or RVEDVI after experimentally induced hemorrhagic shock. Methods: Twenty-six anesthetized and mechanically ventilated pigs were allocated into control (group I), PPV (group II), or RVEDVI (group III) groups. Hemorrhagic shock was induced by blood withdrawal to a target mean arterial pressure of 40 mm Hg, maintained for 60 minutes. Parameters were measured at baseline, at the time of shock, 60 minutes after shock, immediately after resuscitation with hydroxyethyl starch 6% (130/0.4), and 1 hour and 2 hours thereafter. The endpoint of fluid resuscitation was defined as the baseline values of PPV and RVEDVI. Statistical analysis of the data was based on analysis of variance for repeated measures followed by the Bonferroni test (p < 0.05). Results: Volume and time to resuscitation were higher in group III than in group II (group III = 1,305 +/- 331 mL and group II = 965 +/- 245 mL, p < 0.05; group III = 24.8 +/- 4.7 minutes and group II = 8.8 +/- 1.3 minutes, p < 0.05, respectively). All static and dynamic parameters and biomarkers of tissue oxygenation were affected by hemorrhagic shock, and nearly all parameters were restored after resuscitation in both groups. Conclusion: In the proposed model of hemorrhagic shock, resuscitation to the established endpoints was achieved in less time and with less volume when guided by PPV than when guided by pulmonary artery catheter-derived RVEDVI.

Relevance:

30.00%

Publisher:

Abstract:

Evidence that combined glucosamine sulfate and chondroitin sulfate (Gluchon) or isolated glucosamine (Glu) modifies joint damage in osteoarthritis (OA) is still lacking. We studied joint pain and cartilage damage using the anterior cruciate ligament transection (ACLT) model. Wistar rats were subjected to ACLT of the right knee (OA) or a sham operation. Groups received either Glu (500 mg/kg), Gluchon (500 mg/kg glucosamine + 400 mg/kg chondroitin) or vehicle (non-treated, NT) per os, starting 7 days prior to ACLT until sacrifice at 70 days. Joint pain was evaluated daily using the rat-knee joint articular incapacitation test. Structural joint damage was assessed using histology, and biochemically as the chondroitin sulfate (CS) content of cartilage by densitometry (microgram per milligram of dried cartilage), compared to standard CS. The molar weight (Mw) of the CS samples, used as a qualitative biochemical parameter, was obtained by comparing their relative mobility on polyacrylamide gel electrophoresis to standard CS. Gluchon, but not Glu, significantly reduced joint pain (P<0.05) compared to NT. There was an increase in CS content in the OA group (77.7 +/- 8.3 mu g/mg) compared to sham (53.5 +/- 11.2 mu g/mg) (P<0.05). The CS from OA samples had a higher Mw ((4.62 +/- 0.24) x 10^4 g/mol) compared to sham ((4.18 +/- 0.19) x 10^4 g/mol) (P<0.05). Gluchon administration significantly reversed both the increase in CS content (54.4 +/- 12.1 mu g/mg) and in Mw ((4.18 +/- 0.2) x 10^4 g/mol) as compared to NT. Isolated Glu decreased CS content, though without reaching statistical significance. Cartilage histology alterations were also significantly prevented by Gluchon administration. Gluchon provides clinical (analgesia) and structural benefits in the ACLT model. This is the first demonstration that the biochemical alterations occurring in parallel to histological damage in OA are prevented by Gluchon administration.

Relevance:

30.00%

Publisher:

Abstract:

For the purpose of developing a longitudinal model to predict hand-and-foot syndrome (HFS) dynamics in patients receiving capecitabine, data from two large phase III studies were used. Of 595 patients in the capecitabine arms, 400 patients were randomly selected to build the model, and the other 195 were assigned for model validation. A score for risk of developing HFS was modeled using the proportional odds model, a sigmoidal maximum effect model driven by capecitabine accumulation as estimated through a kinetic-pharmacodynamic model and a Markov process. The lower the calculated creatinine clearance value at inclusion, the higher was the risk of HFS. Model validation was performed by visual and statistical predictive checks. The predictive dynamic model of HFS in patients receiving capecitabine allows the prediction of toxicity risk based on cumulative capecitabine dose and previous HFS grade. This dose-toxicity model will be useful in developing Bayesian individual treatment adaptations and may be of use in the clinic.
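
The proportional-odds component with a Markov dependence on the previous grade can be written compactly: cumulative logits give P(grade <= k) as a function of a linear predictor in drug exposure and the previous grade. All cutpoints and coefficients below are hypothetical placeholders, not the published estimates, and the kinetic-pharmacodynamic exposure model is not reproduced.

```python
import numpy as np

def hfs_grade_probs(exposure, prev_grade, cutpoints, beta_exp, beta_prev):
    """Proportional-odds probabilities for ordered HFS grades 0..K,
    with the linear predictor driven by drug exposure and the
    previous grade (a first-order Markov dependence)."""
    eta = beta_exp * exposure + beta_prev * prev_grade
    cum = 1.0 / (1.0 + np.exp(-(np.asarray(cutpoints) - eta)))  # P(Y<=k)
    cum = np.append(cum, 1.0)          # P(Y <= K) = 1
    return np.diff(cum, prepend=0.0)   # per-grade probabilities

# grades 0-3; cutpoints and effect sizes are assumptions
p = hfs_grade_probs(exposure=1.2, prev_grade=1,
                    cutpoints=(0.5, 1.8, 3.0),
                    beta_exp=0.9, beta_prev=0.7)
```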

Relevance:

30.00%

Publisher:

Abstract:

Objective: The aim of this article is to propose an integrated framework for extracting and describing patterns of disorders from medical images using a combination of linear discriminant analysis and active contour models. Methods: A multivariate statistical methodology was first used to identify the most discriminating hyperplane separating the two groups of images (from healthy controls and patients with schizophrenia) contained in the input data. After this, the present work makes explicit the differences found by the multivariate statistical method by subtracting the discriminant models of controls and patients, weighted by the pooled variance between the two groups. A variational level-set technique was used to segment clusters of these differences. We obtain a label for each anatomical change using the Talairach atlas. Results: In this work all the data were analysed simultaneously rather than assuming a priori regions of interest. As a consequence, by using active contour models, we were able to obtain regions of interest that were emergent from the data. The results were evaluated using, as a gold standard, well-known facts about the neuroanatomical changes related to schizophrenia. Most of the items in the gold standard were covered by our result set. Conclusions: We argue that this investigation provides a suitable framework for characterising the high complexity of magnetic resonance images in schizophrenia, as the results obtained indicate a high sensitivity rate with respect to the gold standard. (C) 2010 Elsevier B.V. All rights reserved.
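
A compact sketch of the first step, the most discriminating hyperplane, using scikit-learn's LinearDiscriminantAnalysis on hypothetical feature matrices, followed by a pooled-variance-weighted group-difference map in the spirit of the abstract; the level-set segmentation and Talairach labelling stages are not reproduced here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# hypothetical feature matrices: rows are subjects, columns are voxels
rng = np.random.default_rng(7)
controls = rng.normal(0.0, 1.0, size=(30, 500))
patients = rng.normal(0.3, 1.0, size=(30, 500))

X = np.vstack([controls, patients])
y = np.array([0] * 30 + [1] * 30)

# most discriminating hyperplane separating the two groups
lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

# group-difference map: subtract the group means, weighted by the
# pooled per-voxel variance between the two groups
pooled_var = (controls.var(axis=0, ddof=1) + patients.var(axis=0, ddof=1)) / 2
diff_map = (patients.mean(axis=0) - controls.mean(axis=0)) / np.sqrt(pooled_var)
```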