985 results for Generalized Lévy Process
Abstract:
The diffusive epidemic process (PED) is a nonequilibrium stochastic model that exhibits a phase transition to an absorbing state. In the model, healthy (A) and sick (B) individuals diffuse on a lattice with diffusion constants DA and DB, respectively. According to a Wilson renormalization-group calculation, the system presents a first-order phase transition for the case DA > DB. Several simulation studies have tried to test this conjecture, but none was able to observe the first-order phase transition; the explanation offered was that simulations in higher dimensions would be needed. This work is motivated by the investigation of the critical behavior of a diffusive epidemic process with Lévy interactions (PEDL) in one dimension. The Lévy distribution allows diffusion jumps of all sizes, effectively carrying the one-dimensional system toward higher-dimensional behavior, and we use it to address this still-unresolved controversy for the case DA > DB. We employ the Monte Carlo method with resuscitation, in which one sick individual is reinserted into the system whenever the order parameter (the density of sick individuals) falls to zero. We apply finite-size scaling to estimate the critical point and the critical exponents, including z, for the case DA > DB.
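Below is a minimal sketch of the resuscitation scheme described above, assuming a simple lattice update in which A and B individuals hop with power-law (Lévy-tailed) jump lengths on a one-dimensional ring; the lattice size, rates, and tail exponent are illustrative choices, not the parameters of the study.

```python
import random

L = 1000                # ring size (illustrative)
N_A, N_B = 900, 100     # initial healthy / sick counts
P_REC = 0.1             # recovery probability B -> A (illustrative)
ALPHA = 1.5             # Lévy tail exponent for jump lengths (illustrative)
D_A, D_B = 0.8, 0.2     # hop probabilities, with D_A > D_B as in the disputed case

def levy_step(alpha):
    """Power-law (Pareto-tailed) jump length with a random direction."""
    u = 1.0 - random.random()                  # u in (0, 1]
    return int(u ** (-1.0 / alpha)) * random.choice((-1, 1))

pos_A = [random.randrange(L) for _ in range(N_A)]
pos_B = [random.randrange(L) for _ in range(N_B)]

for sweep in range(10_000):
    # diffusion with Lévy-distributed jumps
    pos_A = [(x + levy_step(ALPHA)) % L if random.random() < D_A else x for x in pos_A]
    pos_B = [(x + levy_step(ALPHA)) % L if random.random() < D_B else x for x in pos_B]
    # contamination: any A sharing a site with a B becomes B
    b_sites = set(pos_B)
    pos_B += [x for x in pos_A if x in b_sites]
    pos_A = [x for x in pos_A if x not in b_sites]
    # recovery: B -> A with probability P_REC
    still_sick = []
    for x in pos_B:
        (pos_A if random.random() < P_REC else still_sick).append(x)
    pos_B = still_sick
    # resuscitation: re-seed one sick individual when the order parameter hits zero
    if not pos_B:
        pos_B.append(pos_A.pop(random.randrange(len(pos_A))))
    rho = len(pos_B) / (N_A + N_B)             # order parameter (sick density)
```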
Abstract:
For the configuration optimization of plate heat exchangers (PHEs), the mathematical models for heat transfer and pressure drop must be valid over a wide range of operational conditions for all configurations of the exchanger; otherwise, the design results may be compromised. In this investigation, the thermal model of a PHE is adjusted to fit experimental non-Newtonian heat-transfer data obtained for eight different configurations, using carboxymethylcellulose (CMC) solutions as the test fluid. Although it is possible to successfully adjust the model parameters, Newtonian and non-Newtonian heat transfer cannot be represented by a single generalized correlation. In addition, the specific heat, thermal conductivity, and power-law rheological parameters of the CMC solutions were correlated with temperature over a range compatible with a continuous pasteurization process.
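As a hedged illustration of this kind of parameter adjustment, the sketch below fits a generic power-law heat-transfer correlation of the form Nu = a·Re^b·Pr^(1/3) to synthetic data; the functional form, data, and parameter values are assumptions for illustration, not the adjusted PHE model of the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def nusselt(X, a, b):
    """Generic power-law heat-transfer correlation: Nu = a * Re**b * Pr**(1/3)."""
    Re, Pr = X
    return a * Re**b * Pr**(1.0 / 3.0)

# synthetic measurements standing in for the experimental runs
rng = np.random.default_rng(0)
Re = rng.uniform(20, 800, size=50)     # generalized Reynolds number
Pr = rng.uniform(50, 500, size=50)     # Prandtl number
Nu_obs = 0.3 * Re**0.65 * Pr**(1 / 3) * rng.normal(1.0, 0.05, size=50)

(a, b), cov = curve_fit(nusselt, (Re, Pr), Nu_obs, p0=(0.2, 0.7))
print(f"fitted a = {a:.3f}, b = {b:.3f}")
```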
Abstract:
This paper examines the investment decisions of 373 large Brazilian firms from 1997 to 2004 in the presence of financial constraints, using panel data. A Bayesian econometric model with ridge regression was used to handle multicollinearity among the variables in the model. Prior distributions are assumed for the parameters, classifying the model into random or fixed effects. We used a Bayesian approach to estimate the parameters, considering normal and Student-t distributions for the error, and assumed that the initial values of the lagged dependent variable are not fixed but generated by a random process. The recursive predictive density criterion was used for model comparison. Twenty models were tested, and the results indicated that multicollinearity does influence the values of the estimated parameters. Controlling for capital intensity, financial constraints are found to be more important for capital-intensive firms, probably due to their lower profitability indexes, higher fixed costs, and higher degree of property diversification.
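The ridge treatment of multicollinearity has a simple conjugate-Bayes reading: under a normal prior beta ~ N(0, (sigma^2/lam)·I), the posterior mean is the ridge estimator. A minimal NumPy sketch with illustrative, near-collinear data (not the study's panel):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 4
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 2] + 0.01 * rng.normal(size=n)   # nearly collinear regressors
y = X @ np.array([1.0, -0.5, 2.0, 0.0]) + rng.normal(size=n)

lam = 10.0  # ridge penalty = ratio of error variance to prior variance
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)  # posterior mean
print(beta_ols)     # unstable estimates on the collinear pair
print(beta_ridge)   # shrinkage stabilizes them
```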
Abstract:
Fundamental principles of mechanics were primarily conceived for constant-mass systems. Since the pioneering works of Meshcherskii (see the historical review in Mikhailov, Mech. Solids 10(5):32-40, 1975), efforts have been made to elaborate an adequate mathematical formalism for variable-mass systems, which remains a current research field in theoretical mechanics. In this paper, attention is focused on the derivation of the so-called 'generalized canonical equations of Hamilton' for a variable-mass particle. The applied technique consists in treating the mass-variation process as a dissipative phenomenon. Kozlov's method (Steklov Inst. Math. 223:178-184, 1998), originally devoted to the derivation of the generalized canonical equations of Hamilton for dissipative systems, is accordingly extended to variable-mass systems. This is done by conveniently writing the flux of kinetic energy from or into the variable-mass particle as a 'Rayleigh-like dissipation function'. Cayley (Proc. R. Soc. Lond. 8:506-511, 1857) was the first scholar to propose such an analogy. A deeper discussion of this particular subject is left for a future paper.
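For concreteness, a hedged sketch of the structure involved: with Hamiltonian H(q, p, t) and a Rayleigh-like dissipation function R accounting for the kinetic-energy flux carried by the gained or lost mass, generalized canonical equations of the Kozlov type take the form

```latex
\dot{q}_i = \frac{\partial H}{\partial p_i}, \qquad
\dot{p}_i = -\frac{\partial H}{\partial q_i} - \frac{\partial R}{\partial \dot{q}_i},
\qquad i = 1, \dots, n.
```

The specific form of R for a particle of time-varying mass follows from the flux term discussed above; the display is the generic dissipative structure, not the paper's particular derivation.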
Abstract:
Complexity in time series is an intriguing feature of living dynamical systems, with potential use for the identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex-system analysis based on a generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by Tsallis generalized entropy, as a function of the parameter q (qSampEn). qSDiff curves were then calculated, consisting of the differences between the qSampEn of the original and surrogate series. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series from stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiff_max) at q ≠ 1. The values of q at which the maximum occurs and at which qSDiff is zero were also evaluated. Only the qSDiff_max values were capable of distinguishing the HRV groups (p-values 5.10 × 10^-3, 1.11 × 10^-7, and 5.50 × 10^-7 for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity, which suggests a potential use for chaotic-system analysis. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4758815]
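A minimal sketch of one plausible reading of qSampEn, assuming the Tsallis generalization simply replaces the natural logarithm in SampEn with the q-logarithm ln_q(x) = (x^(1-q) - 1)/(1 - q); the embedding dimension m, tolerance r, and test series are illustrative.

```python
import numpy as np

def q_log(x, q):
    """Tsallis q-logarithm; reduces to ln(x) as q -> 1."""
    return np.log(x) if abs(q - 1.0) < 1e-12 else (x**(1.0 - q) - 1.0) / (1.0 - q)

def match_count(x, m, r):
    """Number of template pairs of length m within Chebyshev distance r."""
    templates = np.lib.stride_tricks.sliding_window_view(x, m)
    count = 0
    for i in range(len(templates) - 1):
        d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
        count += np.sum(d <= r)   # self-matches excluded by the slice
    return count

def q_samp_en(x, q, m=2, r=None):
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std() if r is None else r
    B = match_count(x, m, r)       # matches of length m
    A = match_count(x, m + 1, r)   # matches of length m + 1
    return -q_log(A / B, q)        # q = 1 recovers ordinary SampEn

rng = np.random.default_rng(0)
x = rng.normal(size=2000)          # uncorrelated (surrogate-like) series
print([round(float(q_samp_en(x, q)), 3) for q in (0.5, 1.0, 1.5)])
```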
Abstract:
Generalized linear mixed models with semiparametric random effects are useful in a wide variety of Bayesian applications. When the random effects arise from a mixture of Dirichlet process (MDP) model, normal base measures and Gibbs sampling procedures based on the Pólya urn scheme are often used to simulate posterior draws. These algorithms are applicable in the conjugate case when (for a normal base measure) the likelihood is normal. In the non-conjugate case, the algorithms proposed by MacEachern and Müller (1998) and Neal (2000) are often applied to generate posterior samples. Some common problems associated with simulation algorithms for non-conjugate MDP models include convergence and mixing difficulties. This paper proposes an algorithm based on the Pólya urn scheme that extends the Gibbs sampling algorithms to non-conjugate models with normal base measures and exponential family likelihoods. The algorithm proceeds by making Laplace approximations to the likelihood function, thereby reducing the procedure to that of conjugate normal MDP models. To ensure the validity of the stationary distribution in the non-conjugate case, the proposals are accepted or rejected by a Metropolis-Hastings step. In the special case where the data are normally distributed, the algorithm is identical to the Gibbs sampler.
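The core device, Laplace-approximating a non-normal likelihood so the update looks conjugate-normal, can be sketched in a few lines. As a stand-in for the exponential-family likelihood, assume a single Poisson observation y with log link and a normal N(m, s2) prior on theta:

```python
import math

def laplace_approx(y, m, s2, iters=25):
    """Normal approximation N(mu, var) to the posterior of theta, where
    y ~ Poisson(exp(theta)) and theta ~ N(m, s2), via Newton's method."""
    theta = math.log(y + 0.5)                       # rough starting point
    for _ in range(iters):
        grad = y - math.exp(theta) - (theta - m) / s2
        hess = -math.exp(theta) - 1.0 / s2
        theta -= grad / hess                        # Newton step toward the mode
    hess = -math.exp(theta) - 1.0 / s2              # curvature at the mode
    return theta, -1.0 / hess                       # mean and variance

mu, var = laplace_approx(y=7, m=0.0, s2=4.0)
print(mu, var)
```

Within the algorithm, a normal approximation of this kind plays the role of the conjugate component in the Pólya urn proposal, and the Metropolis-Hastings step corrects for the approximation error.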
Abstract:
The current state of health and biomedicine includes an enormous number of heterogeneous data 'silos', collected for different purposes and represented differently, that are presently impossible to share or analyze in toto. The greatest challenge for large-scale, meaningful analyses of health-related data is to achieve a uniform data representation for data extracted from heterogeneous source representations. Based upon an analysis and categorization of heterogeneities, a process for achieving comparable data content through a uniform terminological representation is developed. This process addresses the types of representational heterogeneities that commonly arise in healthcare data integration problems. Specifically, it uses a reference terminology and associated 'maps' to transform heterogeneous data to a standard representation for comparability and secondary use. Capturing the quality and precision of the 'maps' between local terms and reference-terminology concepts enhances the meaning of the aggregated data, empowering end users with better-informed queries for subsequent analyses. A data integration case study in the domain of pediatric asthma illustrates the development and use of a reference terminology for creating comparable data from heterogeneous source representations. The contribution of this research is a generalized process for the integration of data from heterogeneous source representations, one that can be applied and extended to other problems where heterogeneous data need to be merged.
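A toy sketch of the map-based transformation step; the local codes, reference concepts, and quality annotations below are all hypothetical:

```python
# Each map entry: local term -> (reference concept, quality of the mapping)
SITE_A_MAP = {
    "asthma, mild": ("REF:Asthma-Mild", "exact"),
    "wheezing":     ("REF:Wheeze",      "narrower-than"),
}
SITE_B_MAP = {
    "MILD ASTHMA":  ("REF:Asthma-Mild", "exact"),
}

def to_reference(record, term_map):
    """Rewrite a source record into the uniform reference representation,
    carrying the map quality along so analysts can filter by precision."""
    concept, quality = term_map[record["diagnosis"]]
    return {**record, "diagnosis": concept, "map_quality": quality}

rows_a = [{"patient": 1, "diagnosis": "asthma, mild"}]
rows_b = [{"patient": 2, "diagnosis": "MILD ASTHMA"}]
uniform = [to_reference(r, SITE_A_MAP) for r in rows_a] + \
          [to_reference(r, SITE_B_MAP) for r in rows_b]
print(uniform)   # comparable records, now queryable by reference concept
```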
Abstract:
This article provides importance sampling algorithms for computing the probabilities of various types of ruin of spectrally negative Lévy risk processes: ruin over the infinite time horizon, ruin within a finite time horizon, and ruin past a finite time horizon. For the special case of the compound Poisson process perturbed by diffusion, algorithms are provided for computing the probabilities of ruin by creeping (i.e., induced by the diffusion term) and by jumping (i.e., by a claim amount). It is shown that these algorithms have either bounded relative error or logarithmic efficiency as t, x → ∞, where t > 0 is the time horizon and x > 0 is the starting point of the risk process, with y = t/x held constant and assumed either below or above a certain constant.
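To make the idea concrete, here is a classical exponential-tilting (Siegmund-type) estimator for the simplest related case: infinite-horizon ruin of a compound Poisson risk process with exponential claims and no diffusion term. The paper's finite-horizon, creeping, and jumping variants are substantially more involved; all rates and the claim law below are assumptions.

```python
import math
import random

def ruin_prob_is(x, lam=1.0, mu=1.0, c=1.5, n_sim=10_000):
    """Importance-sampling estimate of P(ruin) with initial capital x, premium
    rate c, and claims ~ Exp(mu) arriving at Poisson rate lam (needs c > lam/mu).
    The claim-surplus walk is simulated under the Lundberg-tilted measure, where
    ruin is certain, and each path is reweighted by its likelihood ratio."""
    gamma = mu - lam / c               # adjustment (Lundberg) coefficient
    lam_t, mu_t = c * mu, lam / c      # tilted arrival rate and claim parameter
    total = 0.0
    for _ in range(n_sim):
        y = 0.0                        # claim-surplus process at claim epochs
        while y <= x:
            y -= c * random.expovariate(lam_t)   # premium income between claims
            y += random.expovariate(mu_t)        # tilted claim size
        total += math.exp(-gamma * y)            # likelihood ratio at ruin
    return total / n_sim

# exact value for this model: (lam / (c * mu)) * exp(-gamma * x)
print(ruin_prob_is(x=10.0))
```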
Abstract:
This article provides an importance sampling algorithm for computing the probability of ruin with recuperation of a spectrally negative Lévy risk process with light-tailed downward jumps. Ruin with recuperation corresponds to the following double passage event: for some t ∈ (0, ∞), the risk process starting at level x ∈ [0, ∞) falls below the null level during the period [0, t] and returns above the null level at the end of the period t. The proposed Monte Carlo estimator is logarithmically efficient as t, x → ∞, when y = t/x is constant and below a certain bound.
Abstract:
The analysis of recurrent events has been widely discussed in the medical, health services, insurance, and engineering areas in recent years. This research proposes a nonhomogeneous Yule process with a proportional intensity assumption to model the hazard function of recurrent-events data and the associated risk factors. The method assumes that repeated events occur for each individual, with given covariates, according to a nonhomogeneous Yule process with intensity function λ_x(t) = λ_0(t) · exp(x′β). One advantage of using a nonhomogeneous Yule process for recurrent events is the assumption that the recurrence rate is proportional to the number of events that have occurred up to time t. Maximum likelihood estimation is used to provide estimates of the parameters in the model, and a generalized scoring iterative procedure is applied in the numerical computation. Model comparisons between the proposed method and other existing recurrent-event models are addressed by simulation. An example concerning recurrent myocardial infarction events, compared between two distinct populations (Mexican-Americans and Non-Hispanic Whites) in the Corpus Christi Heart Project, is examined.
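A hedged sketch of simulating one subject's recurrent events under this model, assuming the 'proportional to the number of events' reading gives intensity lambda(t) = N(t-) · lambda_0(t) · exp(x'beta) with the count started at 1, an illustrative baseline lambda_0, and Lewis-Shedler-style thinning:

```python
import math
import random

def lam0(t):
    """Illustrative baseline intensity (not from the paper)."""
    return 0.5 + 0.1 * t

def simulate_yule(x_beta, T=5.0, lam0_max=1.0):
    """Thinning simulation of lambda(t) = N(t-) * lam0(t) * exp(x_beta), where
    N(t-) is 1 + the number of events so far (the Yule feature: the rate grows
    with the count). lam0_max must dominate lam0 on [0, T]."""
    events, t, n = [], 0.0, 1
    while True:
        bound = n * lam0_max * math.exp(x_beta)      # dominating constant rate
        t += random.expovariate(bound)               # candidate event time
        if t > T:
            return events
        if random.random() < (n * lam0(t) * math.exp(x_beta)) / bound:
            events.append(t)                         # accepted event
            n += 1

random.seed(0)
print(simulate_yule(x_beta=0.3))
```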
Abstract:
This multidisciplinary study concerns the optimal design of processes with a view to both maximizing profit and minimizing environmental impacts. This can be achieved by a combination of traditional chemical process design methods, measurements of environmental impacts, and advanced mathematical optimization techniques. More to the point, this paper presents a hybrid simulation-multiobjective optimization approach that simultaneously optimizes the production cost and minimizes the associated environmental impacts of isobutane alkylation. This approach also makes it possible to obtain the flowsheet configurations and process variables needed to manufacture isooctane in a way that satisfies this twofold aim. The problem is formulated as a Generalized Disjunctive Programming problem and solved using state-of-the-art logic-based algorithms. It is shown, starting from existing alternatives for the process, that it is possible to systematically generate a superstructure that includes alternatives not previously considered. The optimal solution, in the form of a Pareto curve, includes different structural alternatives from which the most suitable design can be selected. To evaluate the environmental impact, Life Cycle Assessment based on two different indicators is employed: Eco-indicator 99 and Global Warming Potential.
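For readers unfamiliar with the formulation, a generic Generalized Disjunctive Programming problem, in the standard textbook form rather than the paper's specific alkylation model, reads

```latex
\min_{x,\,Y}\; f(x) + \sum_{k \in K} c_k
\quad \text{s.t.} \quad g(x) \le 0, \qquad
\bigvee_{i \in D_k}
\begin{bmatrix} Y_{ik} \\ r_{ik}(x) \le 0 \\ c_k = \gamma_{ik} \end{bmatrix}
\quad k \in K, \qquad
\Omega(Y) = \text{True}, \qquad x \in X, \; Y_{ik} \in \{\text{True}, \text{False}\}.
```

Each disjunction k selects one alternative i (e.g., a flowsheet choice), activating its constraints r_ik(x) ≤ 0 and fixed cost γ_ik; logic-based algorithms branch on the Boolean variables Y directly rather than reformulating the problem as an MINLP.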
Abstract:
This paper introduces a new mathematical model for the simultaneous synthesis of heat exchanger networks (HENs), wherein the pressure manipulation of process streams is used to enhance heat integration. The proposed approach combines generalized disjunctive programming (GDP) and a mixed-integer nonlinear programming (MINLP) formulation in order to minimize the total annualized cost, composed of operational and capital expenses. A multi-stage superstructure is developed for the HEN synthesis, assuming constant heat-capacity flow rates and isothermal mixing, and allowing for stream splits. In this model, the pressures and temperatures of the streams are treated as optimization variables, further increasing the complexity and difficulty of solving the problem. In addition, the model allows for the coupling of compressors and turbines to save energy. A case study is performed to verify the accuracy of the proposed model. In this example, the optimal integration between heat and work decreases the need for thermal utilities in the HEN design. As a result, the total annualized cost is also reduced, owing to the decrease in operational expenses related to the heating and cooling of the streams.
Abstract:
Multiobjective Generalized Disjunctive Programming (MO-GDP) optimization has been used for the synthesis of an important industrial process, isobutane alkylation. The two objective functions to be simultaneously optimized are the environmental impact, determined by means of LCA (Life Cycle Assessment), and the economic potential of the process. The main reason for including the minimization of environmental impact in the optimization is the general public's widespread environmental concern. To solve the problem we employed a hybrid simulation-optimization methodology: the superstructure of the process was developed directly in a chemical process simulator connected to a state-of-the-art optimizer. The model was formulated as a GDP and solved using a logic algorithm that avoids reformulation as an MINLP (mixed-integer nonlinear program). Our research yielded Pareto curves comprising three different configurations, with the LCA assessed by two different indicators: Global Warming Potential and Eco-indicator 99.
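The bi-objective trade-off can be traced with a simple scalarization sweep. Below is a toy weighted-sum sketch over a finite set of candidate configurations, with profits and impacts made up purely for illustration:

```python
# Candidate flowsheet configurations: (name, economic potential, LCA impact)
candidates = [("config-1", 9.0, 7.0), ("config-2", 7.5, 4.0), ("config-3", 5.0, 2.5)]

pareto = []
for w in [i / 20 for i in range(21)]:            # sweep the objective weight
    # scalarized objective: maximize  w * profit - (1 - w) * impact
    best = max(candidates, key=lambda c: w * c[1] - (1 - w) * c[2])
    if best[0] not in [p[0] for p in pareto]:
        pareto.append(best)

print(pareto)   # the non-dominated configurations tracing the Pareto curve
```

A weighted-sum sweep only recovers supported Pareto points; methods such as the epsilon-constraint approach are needed for non-convex fronts.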
Abstract:
This paper presents a new mathematical programming model for the retrofit of heat exchanger networks (HENs), wherein the pressure recovery of process streams is used to enhance heat integration. Particularly applicable to cryogenic processes, HEN retrofit with combined heat and work integration is mainly aimed at reducing the use of expensive cold utilities. The proposed multi-stage superstructure allows both the expansion of the existing heat transfer area and the use of new equipment for heat exchange and pressure manipulation. The pressure recovery of the streams is carried out simultaneously with the HEN design, such that the process conditions (stream pressures and temperatures) are optimization variables. The mathematical model is formulated using generalized disjunctive programming (GDP) and optimized via mixed-integer nonlinear programming (MINLP), through the minimization of the retrofit total annualized cost, considering turbine and compressor coupling with a helper motor. Three case studies are performed to assess the accuracy of the developed approach, including a real industrial example related to liquefied natural gas (LNG) production. The results show that the pressure recovery of streams is effective for energy savings and, consequently, for decreasing the total HEN retrofit cost, especially in sub-ambient processes.