24 results for Robust Probabilistic Model, Dyslexic Users, Rewriting, Question-Answering
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
We study a probabilistic model of interacting spins indexed by elements of a finite subset of the d-dimensional integer lattice, d ≥ 1. Conditions of time reversibility are examined. It is shown that the model equilibrium distribution converges to a limit distribution as the indexing set expands to the whole lattice. The occupied site percolation problem is solved for the limit distribution. Two models with similar dynamics are also discussed.
Abstract:
Abstract Background One goal of gene expression profiling is to identify signature genes that robustly distinguish different types or grades of tumors. Several tumor classifiers based on expression profiling have been proposed using the microarray technique. Due to important differences in the probabilistic models of microarray and SAGE technologies, it is important to develop suitable techniques to select specific genes from SAGE measurements. Results A new framework to select specific genes that distinguish different biological states based on the analysis of SAGE data is proposed. The new framework applies the bolstered error for the identification of strong genes that separate the biological states in a feature space defined by the gene expression of a training set. Credibility intervals defined from a probabilistic model of SAGE measurements are used to identify the genes that distinguish the different states with more reliability among all gene groups selected by the strong-genes method. A score that takes into account the credibility and bolstered error values is proposed to rank the considered gene groups. Results obtained using SAGE data from gliomas are presented, corroborating the introduced methodology. Conclusion The model representing counting data, such as SAGE, provides additional statistical information that allows a more robust analysis. The additional statistical information provided by the probabilistic model is incorporated in the methodology described in the paper. The introduced method is suitable to identify signature genes that lead to a good separation of the biological states using SAGE and may be adapted for other counting methods such as Massively Parallel Signature Sequencing (MPSS) or the recent Sequencing-By-Synthesis (SBS) technique. Some of the genes identified by the proposed method may be useful for generating classifiers.
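As a rough illustration of the kind of credibility interval such a counting-data model yields (this is an assumption about the setup, not the authors' code; the tag counts, library sizes and the Beta(1,1) prior are hypothetical):

```python
# Minimal sketch (an assumption about the kind of model, not the authors' code):
# credibility intervals for a tag's abundance in SAGE-like counting data using a
# Beta posterior on the tag proportion; non-overlapping intervals for two
# biological states suggest the gene separates them reliably.
from scipy import stats

def credibility_interval(tag_count, library_size, level=0.95):
    """Equal-tailed Bayesian interval for the tag proportion (uniform Beta(1,1) prior)."""
    posterior = stats.beta(tag_count + 1, library_size - tag_count + 1)
    lo, hi = posterior.ppf([(1 - level) / 2, (1 + level) / 2])
    return lo, hi

# Hypothetical tag counts for one gene in two tumor grades
ci_grade_I  = credibility_interval(tag_count=40, library_size=50_000)
ci_grade_IV = credibility_interval(tag_count=5,  library_size=50_000)
print("grade I :", ci_grade_I)
print("grade IV:", ci_grade_IV)
```

If the intervals for two states do not overlap, the gene separates them with high reliability; the score described above would combine this information with the bolstered error.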
Abstract:
A detailed numerical simulation of ethanol turbulent spray combustion on a rounded jet flame is presented in this article. The focus is to propose a robust mathematical model with relatively low-complexity sub-models to reproduce the main characteristics of the coupling between both phases, such as the turbulence modulation, turbulent droplet dissipation, and evaporative cooling effect. A RANS turbulence model is implemented. Special features of the model include an Eulerian–Lagrangian procedure under a fully two-way coupling and a modified flame sheet model with a joint mixture fraction–enthalpy β-PDF. Reasonable agreement between measured and computed mean profiles of temperature of the gas phase and droplet size distributions is achieved. Deviations found between measured and predicted mean velocity profiles are attributed to the turbulent combustion modeling adopted.
Abstract:
Objective: Asthma is the most common chronic disease in childhood and has been designated a public health problem due to the increase in its prevalence in recent decades, the amount of health service expenditure it absorbs and an absence of consensus about its etiology. The relationships among psychosocial factors and the occurrence, symptomatology, and severity of asthma have recently been considered. There is still controversy about the association between asthma and a child's mental health, since the pathways through which this relationship is established are complex and not well researched. This study aims to investigate whether behavior problems are associated with the prevalence of asthma symptoms in a large urban center in Latin America. Methods: It is a cross-sectional study of 869 children between 6 and 12 years old, residents of Salvador, Brazil. The International Study of Asthma and Allergies in Childhood (ISAAC) instrument was used to evaluate the prevalence of asthma symptoms. The Child Behavior Checklist (CBCL) was employed to evaluate behavioral problems. Results: 19.26% (n = 212) of the children presented symptoms of asthma. 35% were classified as having clinical behavioral problems. A Poisson robust regression model demonstrated a statistically significant association between the presence of behavioral problems and the occurrence of asthma symptoms (PR: 1.43; 95% CI: 1.10-1.85). Conclusion: These results suggest an association between behavioral problems and pediatric asthma, and support the inclusion of mental health care in the provision of services for asthma morbidity. (C) 2011 Elsevier Inc. All rights reserved.
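A minimal sketch of the kind of Poisson regression with robust (sandwich) standard errors used to estimate a prevalence ratio in a cross-sectional design (the column names and simulated data are hypothetical, not the study's data):

```python
# Minimal sketch (not the authors' code): Poisson regression with robust (HC1)
# standard errors to estimate a prevalence ratio for a binary outcome, as is
# common in cross-sectional studies. Column names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 869
df = pd.DataFrame({
    "behavior_problem": rng.integers(0, 2, n),   # 1 = clinical behavioral problems
})
# Simulate asthma symptoms with higher prevalence when behavior problems are present
p = np.where(df["behavior_problem"] == 1, 0.25, 0.17)
df["asthma_symptoms"] = rng.binomial(1, p)

# A Poisson GLM with robust covariance yields prevalence ratios for binary outcomes
model = smf.glm("asthma_symptoms ~ behavior_problem", data=df,
                family=sm.families.Poisson()).fit(cov_type="HC1")
pr = np.exp(model.params["behavior_problem"])
ci = np.exp(model.conf_int().loc["behavior_problem"])
print(f"PR = {pr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```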
Abstract:
Financial markets can be viewed as a highly complex evolving system that is very sensitive to economic instabilities. The complex organization of the market can be represented in a suitable fashion in terms of complex networks, which can be constructed from stock prices such that each pair of stocks is connected by a weighted edge that encodes the distance between them. In this work, we propose an approach to analyze the topological and dynamic evolution of financial networks based on the stock correlation matrices. An entropy-related measurement is adopted to quantify the robustness of the evolving financial market organization. It is verified that the network topological organization undergoes strong variation during financial instabilities and the networks in such periods become less robust. A statistical robust regression model is proposed to quantify the relationship between the network structure and resilience. The obtained coefficients of this model indicate that the average shortest path length is the measurement most related to the network resilience coefficient. This result indicates that a collective behavior is observed between stocks during financial crises. More specifically, stocks tend to synchronize their price evolution, leading to a high correlation between pairs of stock prices, which contributes to the increase in distance between them and, consequently, decreases the network resilience. (C) 2012 American Institute of Physics. [doi:10.1063/1.3683467]
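A minimal sketch of one common way to build such a stock network and compute the average shortest path length (the correlation-based distance d_ij = sqrt(2(1 - rho_ij)) is an assumption about the metric, not necessarily the paper's choice; the return series is synthetic):

```python
# Minimal sketch (assumptions, not the paper's code): build a stock network from
# a correlation matrix using d_ij = sqrt(2 * (1 - rho_ij)) and compute the
# average shortest path length, the topological measurement reported as most
# related to network resilience.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
returns = rng.normal(size=(250, 20))          # 250 days x 20 stocks (synthetic)
rho = np.corrcoef(returns, rowvar=False)      # correlation matrix
dist = np.sqrt(2.0 * (1.0 - rho))             # correlation-based distance

G = nx.Graph()
n = dist.shape[0]
for i in range(n):
    for j in range(i + 1, n):
        G.add_edge(i, j, weight=dist[i, j])

avg_path = nx.average_shortest_path_length(G, weight="weight")
print(f"average shortest path length = {avg_path:.3f}")
```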
Abstract:
Abstract Background A large number of probabilistic models used in sequence analysis assign non-zero probability values to most input sequences. To decide whether a given probability is sufficient, the most common approach is Bayesian binary classification, in which the probability of the model characterizing the sequence family of interest is compared to that of an alternative probability model. A null model can be used as the alternative model. This is the scoring technique used by sequence analysis tools such as HMMER, SAM and INFERNAL. The most prevalent null models are position-independent residue distributions, which include the uniform distribution, the genomic distribution, the family-specific distribution and the target sequence distribution. This paper presents a study evaluating the impact of the choice of null model on the final result of classifications. In particular, we are interested in minimizing the number of false predictions in a classification. This is a crucial issue for reducing the costs of biological validation. Results For all the tests, the target null model presented the lowest number of false positives when using random sequences as a test. The study was performed on DNA sequences using GC content as the measure of content bias, but the results should also be valid for protein sequences. To broaden the application of the results, the study was performed using randomly generated sequences. Previous studies were performed on amino acid sequences, using only one probabilistic model (HMM) and a specific benchmark, and lacked more general conclusions about the performance of null models. Finally, a benchmark test with P. falciparum confirmed these results. Conclusions Of the evaluated models, the best suited for classification are the uniform model and the target model. However, the use of the uniform model presents a GC bias that can cause more false positives for candidate sequences with extreme compositional bias, a characteristic not described in previous studies. In these cases, the target model is more dependable for biological validation due to its higher specificity.
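A toy illustration of log-odds scoring against a uniform null versus a target-composition null (the family model and the sequence are invented for illustration; this is not HMMER/SAM/INFERNAL code):

```python
# Minimal sketch (illustrative only): Bayesian binary classification of a DNA
# sequence by a log-odds score, log P(s | family model) - log P(s | null model),
# comparing a uniform null with a target-composition null (per-sequence residue
# frequencies).
import math
from collections import Counter

def log_prob_iid(seq, probs):
    """Log-probability of a sequence under position-independent residue probabilities."""
    return sum(math.log(probs[c]) for c in seq)

def target_null(seq):
    """Null model built from the target sequence's own residue composition."""
    counts = Counter(seq)
    return {c: counts[c] / len(seq) for c in counts}

seq = "GCGCGCGGCCATGCGCGC"                                # GC-rich target (synthetic)
family_model = {"A": 0.1, "C": 0.4, "G": 0.4, "T": 0.1}   # toy family model
uniform_null = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}

score_uniform = log_prob_iid(seq, family_model) - log_prob_iid(seq, uniform_null)
score_target = log_prob_iid(seq, family_model) - log_prob_iid(seq, target_null(seq))
print(f"log-odds vs uniform null: {score_uniform:.2f}")
print(f"log-odds vs target null:  {score_target:.2f}")
```

For the GC-rich example above, the uniform null inflates the score while the target null discounts the compositional bias, which is the behaviour behind the specificity difference discussed in the abstract.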
A Robust Structural PGN Model for Control of Cell-Cycle Progression Stabilized by Negative Feedbacks
Abstract:
The cell division cycle comprises a sequence of phenomena controlled by a stable and robust genetic network. We applied a probabilistic genetic network (PGN) to construct a hypothetical model with a dynamical behavior displaying the degree of robustness typical of the biological cell cycle. The structure of our PGN model was inspired by well-established biological facts such as the existence of integrator subsystems, negative and positive feedback loops, and redundant signaling pathways. Our model represents gene interactions as stochastic processes and presents strong robustness in the presence of moderate noise and parameter fluctuations. A recently published deterministic yeast cell-cycle model does not perform as well as our PGN model, even under moderate noise conditions. In addition, self-stimulatory mechanisms can give our PGN model a pacemaker activity similar to that observed in the oscillatory embryonic cell cycle.
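A tiny, hypothetical example of a stochastic gene network with a negative feedback loop and per-gene noise, only to convey the flavour of a PGN-style model (the genes, rules and noise level are invented and unrelated to the paper's cell-cycle network):

```python
# Minimal sketch (an illustration of the idea, not the paper's network): a small
# stochastic Boolean gene network in which each gene follows its regulatory rule
# with high probability and is perturbed by noise otherwise; the negative
# feedback loop keeps the trajectory oscillating despite the noise.
import numpy as np

rng = np.random.default_rng(7)
noise = 0.05                       # probability of flipping a gene at each step

def step(state):
    a, b, c = state
    target = np.array([
        1 - c,          # gene A is repressed by C (negative feedback)
        a,              # gene B is activated by A
        b,              # gene C is activated by B
    ])
    flips = rng.random(3) < noise
    return np.where(flips, 1 - target, target)

state = np.array([1, 0, 0])
trajectory = [state]
for _ in range(20):
    state = step(state)
    trajectory.append(state)
print(np.array(trajectory))
```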
Abstract:
This work addresses the problem of robust model predictive control (MPC) of systems with model uncertainty. The case of zone control of multivariable stable systems with multiple time delays is considered. The usual approach to dealing with this kind of problem is through the inclusion of a non-linear cost constraint in the control problem. The control action is then obtained at each sampling time as the solution to a non-linear programming (NLP) problem that, for high-order systems, can be computationally expensive. Here, the robust MPC problem is formulated as a linear matrix inequality problem that can be solved in real time with a fraction of the computational effort. The proposed approach is compared with the conventional robust MPC and tested through the simulation of a reactor system from the process industry.
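A crude numerical illustration of the kind of matrix condition involved, checking whether one quadratic Lyapunov certificate covers every vertex of a polytopic model set (this is not the paper's LMI formulation; the matrices are toy values, and the candidate certificate is taken from the nominal Lyapunov equation rather than solved for via an LMI):

```python
# Minimal sketch (illustration only): quadratic robust stability check for a
# polytopic uncertain discrete-time system, i.e. whether A_i^T P A_i - P is
# negative definite for all vertex models A_i with a single matrix P. The paper
# poses the search for such certificates (and the MPC cost bound) as LMIs; here
# P is simply taken from the nominal model's Lyapunov equation.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

A_vertices = [
    np.array([[0.90, 0.20], [0.00, 0.80]]),   # vertex model 1 (toy values)
    np.array([[0.85, 0.30], [0.05, 0.75]]),   # vertex model 2 (toy values)
]
A_nominal = 0.5 * (A_vertices[0] + A_vertices[1])

# Candidate certificate: P solves A_nom^T P A_nom - P = -I
P = solve_discrete_lyapunov(A_nominal.T, np.eye(2))

for i, A in enumerate(A_vertices):
    M = A.T @ P @ A - P
    eigmax = np.max(np.linalg.eigvalsh(M))
    print(f"vertex {i}: max eigenvalue of A^T P A - P = {eigmax:.4f} "
          f"({'certified' if eigmax < 0 else 'not certified'})")
```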
Abstract:
This paper addresses the numerical solution of random crack propagation problems using the boundary element method (BEM) coupled with reliability algorithms. The crack propagation phenomenon is efficiently modelled using BEM, due to its mesh reduction features. The BEM model is based on the dual BEM formulation, in which singular and hyper-singular integral equations are adopted to construct the system of algebraic equations. Two reliability algorithms are coupled with the BEM model. The first is the well-known response surface method, in which local, adaptive polynomial approximations of the mechanical response are constructed in search of the design point. Different experiment designs and adaptive schemes are considered. The alternative approach, direct coupling, in which the limit state function remains implicit and its gradients are calculated directly from the numerical mechanical response, is also considered. The performance of both coupling methods is compared in application to some crack propagation problems. The investigation shows that the direct coupling scheme converged for all problems studied, irrespective of the problem nonlinearity. The computational cost of direct coupling has been shown to be a fraction of the cost of response surface solutions, regardless of the experiment design or adaptive scheme considered. (C) 2012 Elsevier Ltd. All rights reserved.
Abstract:
In epidemiology, the basic reproduction number R0 is usually defined as the average number of new infections caused by a single infective individual introduced into a completely susceptible population. According to this definition, R0 is related to the initial stage of the spreading of a contagious disease. However, in epidemiological models based on ordinary differential equations (ODE), R0 is commonly derived from a linear stability analysis and interpreted as a bifurcation parameter: typically, when R0 > 1, the contagious disease tends to persist in the population because the endemic stationary solution is asymptotically stable; when R0 < 1, the corresponding pathogen tends to naturally disappear because the disease-free stationary solution is asymptotically stable. Here we intend to answer the following question: do these two different approaches for calculating R0 give the same numerical values? In other words, is the number of secondary infections caused by a unique sick individual equal to the threshold obtained from the stability analysis of the steady states of the ODE? To find the answer, we use a susceptible-infective-recovered (SIR) model described in terms of ODE and also in terms of a probabilistic cellular automaton (PCA), where each individual (corresponding to a cell of the PCA lattice) is connected to others by a random network favoring local contacts. The values of R0 obtained from both approaches are compared, showing good agreement. (C) 2012 Elsevier B.V. All rights reserved.
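A minimal sketch of the ODE side of this comparison, with R0 = beta/gamma from the stability analysis of the classical SIR model (parameter values are arbitrary; the PCA side is not reproduced here):

```python
# Minimal sketch (assumptions only): integrate the classical SIR ODE
#   dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I
# and report R0 = beta/gamma, the threshold obtained from the linear stability
# analysis of the disease-free steady state when S(0) is close to 1.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 0.5, 0.25          # transmission and recovery rates (arbitrary)
R0 = beta / gamma                # threshold from the stability analysis
print(f"R0 = beta/gamma = {R0:.2f}")

def sir(t, y):
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

sol = solve_ivp(sir, (0, 200), [0.999, 0.001, 0.0], dense_output=True)
S, I, R = sol.y[:, -1]
print(f"final susceptible fraction ~ {S:.3f}  (an epidemic occurs since R0 > 1)")
```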
Abstract:
In general, the term "Lagrangian coherent structure" (LCS) is used to refer to structures whose properties are similar to a time-dependent analog of stable and unstable manifolds of a hyperbolic fixed point in Hamiltonian systems. Recently, the term LCS was used to describe a different type of structure, whose properties are similar to those of invariant tori in certain classes of two-dimensional incompressible flows. A new kind of LCS was obtained. It consists of barriers, called robust tori, that block the trajectories in certain regions of the phase space. We used the Double-Gyre Flow system as the model. In this system, the robust tori play the role of a skeleton for the dynamics and horizontally block vortices that come from different parts of the phase space. (C) 2012 Elsevier B.V. All rights reserved.
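A minimal sketch of the standard Double-Gyre Flow velocity field and the integration of a single tracer (the parameter values are the commonly used illustrative ones, not necessarily those of the paper):

```python
# Minimal sketch (standard double-gyre definition, not the paper's code): the
# periodically forced Double-Gyre velocity field on [0,2] x [0,1] and one tracer
# trajectory obtained by numerical integration.
import numpy as np
from scipy.integrate import solve_ivp

A, eps, omega = 0.1, 0.25, 2 * np.pi / 10   # amplitude, perturbation, forcing frequency

def velocity(t, xy):
    x, y = xy
    f = eps * np.sin(omega * t) * x**2 + (1 - 2 * eps * np.sin(omega * t)) * x
    dfdx = 2 * eps * np.sin(omega * t) * x + (1 - 2 * eps * np.sin(omega * t))
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return [u, v]

traj = solve_ivp(velocity, (0.0, 50.0), [0.3, 0.4], max_step=0.1)
print(traj.y[:, -1])   # tracer position after 50 time units
```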
Abstract:
Cognitive dissonance is the stress that comes from holding two conflicting thoughts simultaneously in the mind, usually arising when people are asked to choose between two detrimental or two beneficial options. In view of the well-established role of emotions in decision making, here we investigate whether the conventional structural models used to represent the relationships among basic emotions, such as the Circumplex model of affect, can also describe the emotions of cognitive dissonance. We presented a questionnaire to 34 anonymous participants, in which each question described a decision to be made between two conflicting motivations and asked the participants to rate, on an analog scale, the pleasantness and the intensity of the experienced emotion. We found that the results were compatible with the predictions of the Circumplex model for basic emotions. (C) 2012 Elsevier Ltd. All rights reserved.
Abstract:
Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is an interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables and a convenient distribution is assigned to them. A very common choice for such a distribution has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed a skew-normal distribution under the centred parameterization (SNCP), as had been studied in [R. B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382], to model the latent trait distribution. This approach allows one to represent any asymmetric behaviour concerning the latent trait distribution. Also, they developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP. They showed that the algorithm recovers all parameters properly. Their results indicated that, in the presence of asymmetry, the proposed model and the estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, in contrast to the algorithm developed by Azevedo et al., which has two such steps. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R. J. Patz and B. W. Junker, A straightforward approach to Markov chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R. J. Patz and B. W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G. O. Roberts, and W. R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter. Furthermore, we study the sensitivity of such priors as well as the use of different kernel densities for this parameter. Finally, we assess the impact of the number of examinees, the number of items and the asymmetry level on parameter recovery. Results of the simulation study indicated that our approach performed as well as that in [3] in terms of parameter recovery, mainly when using the Jeffreys prior. Also, they indicated that the asymmetry level has the highest impact on parameter recovery, even though it is relatively small. A real data analysis is considered jointly with the development of model fitting assessment tools. The results are compared with the ones obtained by Azevedo et al. The results indicate that the hierarchical approach allows MCMC algorithms to be implemented more easily, facilitates diagnosis of convergence and can be very useful for fitting more complex skew IRT models.
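A small sketch of Henze's stochastic representation underlying the hierarchical formulation, stated for the direct-parameterization skew-normal (illustrative only; it omits the centred parameterization and the IRT structure):

```python
# Minimal sketch (illustrative only): Henze's stochastic representation of the
# skew-normal, Z = delta*|U0| + sqrt(1 - delta^2)*U1 with U0, U1 ~ N(0,1), which
# is the data-augmentation device behind hierarchical MCMC schemes of this kind.
import numpy as np
from scipy import stats

alpha = 3.0                                  # shape (asymmetry) parameter
delta = alpha / np.sqrt(1.0 + alpha**2)

rng = np.random.default_rng(42)
u0 = np.abs(rng.standard_normal(100_000))    # half-normal latent variable
u1 = rng.standard_normal(100_000)
z = delta * u0 + np.sqrt(1.0 - delta**2) * u1

# Compare simulated draws with scipy's skew-normal distribution
print("sample mean vs skewnorm mean:", z.mean(), stats.skewnorm(alpha).mean())
print("sample std  vs skewnorm std: ", z.std(), stats.skewnorm(alpha).std())
```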
Abstract:
Background. Chronic allograft vasculopathy (CAV) is an important cause of graft loss. Considering the immune inflammatory events involved in the development of CAV, therapeutic approaches to target this process are of relevance. Human amniotic fluid derived stem cells (hAFSCs), a class of fetal, pluripotent stem cells with intermediate characteristics between embryonic and adult stem cells, display immunomodulatory properties. hAFSCs express mesenchymal and embryonic markers and show high proliferation rates; however, they do not induce tumor formation, and their use does not raise ethical issues. Thus, we sought to investigate the effect of hAFSC on CAV in a model of aorta transplantation. Methods. Orthotopic aorta transplantation was performed using Fisher (F344) rats as donors and Lewis rats as recipients. Rats were divided into three groups: syngeneic (SYNG), untreated F344 receiving aorta from F344 (n = 8); allogeneic (ALLO), Lewis rats receiving allogeneic aorta from F344 (n = 8); and ALLO + hAFSC, ALLO rats treated with hAFSC (10^6 cells; n = 8). Histological analysis and immunohistochemistry were performed 30 days posttransplantation. Results. The ALLO group developed a robust aortic neointimal formation (208.7 ± 25.4 μm) accompanied by a significantly high number of ED1+ cells (4845 ± 841 cells/mm^2) and CD43+ cells (4064 ± 563 cells/mm^2), and enhanced expression of α-smooth muscle actin in the neointima (25 ± 6%). Treatment with hAFSC diminished neointimal thickness (180.7 ± 23.7 μm) and induced a significant decrease in ED1+ cells (1100 ± 276 cells/mm^2), CD43+ cells (1080 ± 309 cells/mm^2), and α-smooth muscle actin expression (8 ± 3%) in the neointima. Conclusions. These preliminary results showed that hAFSC suppressed inflammation and myofibroblast migration to the intima, which may contribute to ameliorating vascular changes in CAV.
Abstract:
In this paper, we carry out robust modeling and influence diagnostics in Birnbaum-Saunders (BS) regression models. Specifically, we present some aspects related to BS and log-BS distributions and their generalizations from the Student-t distribution, and develop BS-t regression models, including maximum likelihood estimation based on the EM algorithm and diagnostic tools. In addition, we apply the obtained results to real data from insurance, which illustrates the usefulness of the proposed model. Copyright (c) 2011 John Wiley & Sons, Ltd.