930 results for dual-process model
Abstract:
We revisit the scaling properties of a model for nonequilibrium wetting [Phys. Rev. Lett. 79, 2710 (1997)], correcting previous estimates of the critical exponents and providing a complete scaling scheme. Moreover, we investigate a special point in the phase diagram, where the model exhibits a roughening transition related to directed percolation. We argue that, in the vicinity of this point, evaporation from the middle of plateaus can be interpreted as an external field in the language of directed percolation. This analogy allows us to compute the crossover exponent and to predict the form of the phase transition line close to its terminal point.
Abstract:
The nuclear gross theory, originally formulated by Takahashi and Yamada (1969 Prog. Theor. Phys. 41 1470) for beta decay, is applied to electron-neutrino-nucleus reactions, employing a more realistic description of the energetics of the Gamow-Teller resonances. The model parameters are calibrated against the most recent experimental data, both for beta(-)-decay and for electron capture, separately for even-even, even-odd, odd-odd and odd-even nuclei. The numerical estimates for neutrino-nucleus cross sections agree fairly well with previous evaluations performed within the framework of microscopic models. The formalism presented here can be extended to the heavy-nuclei mass region, where weak processes are quite relevant; this is of astrophysical interest because of its applications to supernova explosive nucleosynthesis.
Abstract:
In the Hammersley-Aldous-Diaconis process, infinitely many particles sit in R and at most one particle is allowed at each position. A particle at x, whose nearest neighbor to the right is at y, jumps at rate y - x to a position uniformly distributed in the interval (x, y). The basic coupling between trajectories with different initial configurations induces a process with different classes of particles. We show that the invariant measures for the two-class process can be obtained as follows. First, a stationary M/M/1 queue is constructed as a function of two homogeneous Poisson processes: the arrivals, with rate lambda, and the (attempted) services, with rate rho > lambda. Then put first-class particles at the instants of departures (effective services) and second-class particles at the instants of unused services. The procedure is generalized to the n-class case by using n - 1 queues in tandem with n - 1 priority types of customers. A multi-line process is introduced; it consists of a coupling (different from Liggett's basic coupling) having as invariant measure the product of Poisson processes. The definition of the multi-line process involves the dual points of the space-time Poisson process used in the graphical construction of the reversed process. The coupled process is a transformation of the multi-line process, and its invariant measure is the transformation described above of the product measure.
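The queue-based construction of the two-class invariant measure can be sketched numerically. The sketch below is only an illustration under stated assumptions: the queue is started empty on a finite time window rather than in stationarity, and all names and parameters are ours, not the paper's.

```python
import random

def mm1_particle_construction(lam, rho, T, seed=0):
    """Sketch of the queue-based construction on the window [0, T]:
    arrivals ~ Poisson(lam), attempted services ~ Poisson(rho), rho > lam.
    First-class particles sit at departure instants (effective services);
    second-class particles sit at unused-service instants."""
    rng = random.Random(seed)

    def poisson_times(rate):
        t, times = 0.0, []
        while True:
            t += rng.expovariate(rate)
            if t > T:
                return times
            times.append(t)

    arrivals = poisson_times(lam)
    services = poisson_times(rho)
    events = sorted([(t, "a") for t in arrivals] + [(t, "s") for t in services])
    queue = 0
    first_class, second_class = [], []
    for t, kind in events:
        if kind == "a":
            queue += 1                  # a customer joins the queue
        elif queue > 0:
            queue -= 1
            first_class.append(t)       # effective service -> 1st class
        else:
            second_class.append(t)      # unused service -> 2nd class
    return first_class, second_class
```

On a long window the first-class particles have empirical density close to lam and the second-class particles close to rho - lam, consistent with the stationary picture; starting the queue empty only perturbs the beginning of the window.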
Abstract:
We study a general stochastic rumour model in which an ignorant individual has a certain probability of becoming a stifler immediately upon hearing the rumour. We refer to this special kind of stifler as an uninterested individual. Our model also includes distinct rates for meetings between two spreaders in which both become stiflers or only one does, so that the classical Daley-Kendall and Maki-Thompson models arise as particular cases. We prove a Law of Large Numbers and a Central Limit Theorem for the proportions of those who ultimately remain ignorant and of those who have heard the rumour but became uninterested in it.
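A minimal Maki-Thompson-style simulation of these dynamics might look as follows. All parameter names (the probability of becoming uninterested, the probability that a spreader-spreader meeting stifles both) are illustrative choices, not the paper's notation.

```python
import random

def rumour_sim(n, p_unint=0.2, p_both=0.5, seed=0):
    """Hedged sketch of the generalized rumour dynamics described above.
    States: 0 = ignorant, 1 = spreader, 2 = stifler.
    An ignorant who hears the rumour becomes an uninterested stifler
    with probability p_unint; when two spreaders meet, both stifle with
    probability p_both, otherwise only the initiator does.
    Returns the final proportion of ignorants."""
    rng = random.Random(seed)
    state = [0] * n
    state[0] = 1
    spreaders = {0}
    while spreaders:
        s = rng.choice(tuple(spreaders))       # initiating spreader
        other = rng.randrange(n)
        if other == s:
            continue
        if state[other] == 0:                  # ignorant hears the rumour
            if rng.random() < p_unint:
                state[other] = 2               # uninterested stifler
            else:
                state[other] = 1
                spreaders.add(other)
        elif state[other] == 1:                # spreader meets spreader
            if rng.random() < p_both:
                state[other] = 2
                spreaders.discard(other)
            state[s] = 2
            spreaders.discard(s)
        else:                                  # meets a stifler
            state[s] = 2
            spreaders.discard(s)
    return state.count(0) / n
```

The Law of Large Numbers in the abstract concerns the limit of this final proportion as n grows; averaging the simulation over many seeds gives a crude empirical check of that limit.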
Abstract:
Currently there is a trend toward expansion of the area cropped with sugarcane (Saccharum officinarum L.), driven by an increase in the world demand for biofuels due to economic, environmental, and geopolitical issues. Although sugarcane is traditionally harvested by burning dried leaves and tops, unburned, mechanized harvesting has been progressively adopted. Process-based models are useful in understanding the effects of plant litter on soil C dynamics. The objective of this work was to use the CENTURY model to evaluate the effect of sugarcane residue management on the temporal dynamics of soil C. The approach taken was to parameterize the CENTURY model for the sugarcane crop, to simulate the temporal dynamics of soil C, validating the model with field experiment data, and finally to make long-term predictions regarding soil C. The main focus of this work was the comparison of soil C stocks between the burned and unburned litter management systems, but the effects of mineral fertilizer and organic residue applications were also evaluated. The simulations were performed with data from experiments of different durations, from 1 to 60 yr, in Goiana and Timbauba, Pernambuco, and Pradopolis, Sao Paulo, all in Brazil, and in Mount Edgecombe, Kwazulu-Natal, South Africa. It was possible to simulate the temporal dynamics of soil C (R(2) = 0.89). The predictions made with the model revealed a long-term trend toward higher soil C stocks under the unburned management. This increase is conditioned by factors such as climate, soil texture, time since adoption of the unburned system, and N fertilizer management.
Abstract:
In this work we study an agent-based model to investigate the role of the degree of information asymmetry in market evolution. The model is quite simple and may be treated analytically, since consumers evaluate the quality of a certain good taking into account only the quality of the last good purchased plus their perceptive capacity beta. As a consequence, the system evolves according to a stationary Markov chain. The value of a good offered by the firms increases with quality according to an exponent alpha, which is a measure of the technology: it incorporates all the technological capacity of the production systems, such as education, scientific development and techniques that change productivity rates. The technological level plays an important role in explaining how information asymmetry may affect market evolution in this model. We observe that, for high technological levels, the market can detect adverse selection. The model allows us to compute the maximum degree of information asymmetry before the market collapses; beyond this critical point the market evolves during a limited period of time and then dies out completely. When beta is close to 1 (symmetric information), the market becomes more profitable for high-quality goods, although high- and low-quality markets coexist. The maximum level of information asymmetry is a consequence of an ergodicity breakdown in the process of quality evaluation. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
The concentration of hydrogen peroxide is an important parameter in azo dye decoloration by advanced oxidation processes, particularly oxidation via UV/H2O2. Beyond a certain concentration, hydrogen peroxide acts as a consumer of the very hydroxyl radicals it generates, so the system's oxidizing power decreases. The critical point of the process (the maximum amount of hydrogen peroxide to be added) was determined through a "thorough mapping", or discretization, of the target region, based on the maximization of an objective function (the pseudo-first-order reaction kinetics constant). The discretization of the operational region was carried out with a feedforward backpropagation neural model. The neural model obtained presented a remarkable correlation coefficient between real and predicted values of the absorbance variable, above 0.98. In the present work, the neural model had the Acid Brown 75 dye decoloration process as its phenomenological basis. The critical point of hydrogen peroxide addition, represented by the mass ratio (F) between the hydrogen peroxide mass and the dye mass, was established in the interval 50 < F < 60. (C) 2007 Elsevier B.V. All rights reserved.
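The "thorough mapping" idea (discretize the operational region and take the point that maximizes the objective function) can be illustrated with a toy kinetic curve. The quadratic form and its coefficients below are invented for illustration only (chosen so the toy optimum falls in the reported interval); the paper's objective was delivered by the fitted neural model, not by a closed-form expression.

```python
# Toy pseudo-first-order rate constant k as a function of the
# H2O2-to-dye mass ratio F. The quadratic form and coefficients are
# invented: k rises with F, then falls once excess H2O2 scavenges
# hydroxyl radicals.
def k_of_F(F, a=0.011, b=0.0001):
    return a * F - b * F ** 2

# Discretize ("thoroughly map") the operational region and take the
# grid point that maximizes the objective function.
grid = [f / 10 for f in range(0, 1001)]   # F in [0, 100], step 0.1
F_star = max(grid, key=k_of_F)
print(F_star)  # → 55.0, inside the reported interval 50 < F < 60
```

The same grid-search step works unchanged if `k_of_F` is replaced by a trained surrogate model evaluated at each grid point.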
Abstract:
Catalytic ozonation has been recognized in the scientific community as an efficient technique, reaching high rates of mineralization of recalcitrant organic material even in the presence of species that scavenge hydroxyl free radicals. This study examines the most significant factors in the treatment of stabilized leachate from the municipal landfill of Guaratingueta, State of Sao Paulo, Brazil, using catalytic ozonation activated by the metallic ions Fe(3+), Zn(2+), Mn(2+), Ni(2+) and Cr(3+). The Taguchi L(16) orthogonal array and its associated statistical methods were also used in this study. Among the ions investigated, the most notable catalysis was obtained with the ferric ion, which was statistically significant in reducing COD at a confidence level of 99.5%.
Abstract:
The aim of this study was to measure the temporal expression of osteogenic genes during bone healing in bone defects treated with low-intensity pulsed ultrasound (LIPUS), by means of histopathologic and real-time polymerase chain reaction (PCR) analysis. Animals were randomly distributed into two groups (n = 30): a control group (bone defect without treatment) and a LIPUS-treated group (bone defect treated with LIPUS). On days 7, 13 and 25 postinjury, 10 rats per group were sacrificed. Treated rats received 30 mW/cm(2) LIPUS. In the LIPUS-treated animals, intense new bone formation, surrounded by highly vascularized connective tissue presenting slight osteogenic activity and primary bone deposition, was observed at the intermediate (13 days) and late (25 days) stages of repair. In addition, quantitative real-time polymerase chain reaction (RT-qPCR) showed an upregulation of the bone morphogenetic protein 4 (BMP4), osteocalcin and Runx2 genes 7 days after surgery. In the intermediate period, there was no increase in expression. The expression of alkaline phosphatase, BMP4 and Runx2 was significantly increased at the last period. Our results indicate that LIPUS therapy improves bone repair in rats and upregulates osteogenic genes, mainly at the late stages of recovery. (E-mail: a.renno@unifesp.br) (C) 2010 Published by Elsevier Inc. on behalf of World Federation for Ultrasound in Medicine & Biology.
Abstract:
This paper discusses the integrated design of parallel manipulators, which exhibit varying dynamics; this characteristic affects machine stability and performance. The design methodology consists of four main steps: (i) system modeling using the flexible multibody technique, (ii) synthesis of reduced-order models suitable for control design, (iii) systematic flexible-model-based input signal design, and (iv) evaluation of some possible machine designs. The novelty in this methodology is that structural flexibilities are taken into consideration during input signal design, thereby enhancing the standard design process, which mainly considers rigid-body dynamics. The potential of the proposed strategy is demonstrated in the design evaluation of a two-degree-of-freedom high-speed parallel manipulator. The results are experimentally validated. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
The machining of hardened steels has always been a great challenge in metal cutting, particularly for drilling operations. Drilling is generally the machining process that is most difficult to cool, owing to the tool's geometry. The aim of this work is to determine the heat flux and the convection coefficient in drilling using the inverse heat conduction method. Temperature was assessed during the drilling of hardened AISI H13 steel using the embedded thermocouple technique. Dry machining and two cooling/lubrication systems were used, with thermocouples fixed at distances very close to the hole's wall. Tests were replicated for each condition and were carried out with new and worn drills. An analytical heat conduction model was used to calculate the temperature at the tool-workpiece interface and to define the heat flux and the convection coefficient. In all tests, with both new and worn drills, the lowest temperatures and the greatest decrease in heat flux were observed with the flooded system, followed by MQL, taking the dry condition as reference. The decrease in temperature was directly proportional to the amount of lubricant applied and was significant for the MQL system when compared with dry cutting. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
Product lifecycle management (PLM) innovates by defining both the product as a central element for aggregating enterprise information and the lifecycle as a new time dimension for information integration and analysis. Because of its potential to shorten innovation lead times and reduce costs, PLM has attracted a great deal of attention in industry and in research. However, the current stage of PLM implementation at most organisations still does not apply lifecycle management concepts thoroughly. In order to close this realisation gap, this article presents a process-oriented framework to support effective PLM implementation. The framework's central point is a set of lifecycle-oriented business process reference models which link the necessary fundamental concepts, enterprise knowledge and software solutions to deploy PLM effectively. (c) 2007 Elsevier B.V. All rights reserved.
Abstract:
This paper presents a proposal for a reference model for software development aimed at small companies. Despite the importance of small software companies in Latin America, the lack of standards of their own, capable of meeting their specific needs, has created serious difficulties in improving their processes and also in quality certification. As a contribution to a better understanding of the subject, we propose a reference model and, as a means of validating the proposal, present a report on its application in a small Brazilian company committed to certification under the MPS.BR quality model.
Abstract:
Fatigue and crack propagation are phenomena affected by high uncertainties, for which deterministic methods fail to accurately predict structural life. The present work couples reliability analysis with the boundary element method (BEM). The latter has been recognized as an accurate and efficient numerical technique for dealing with mixed-mode propagation, which makes it very attractive for reliability analysis. The coupled procedure allows us to consider uncertainties during the crack growth process and to compute the probability of fatigue failure for complex structural geometries and loadings. Two coupling procedures are considered: direct coupling of the reliability and mechanical solvers, and indirect coupling through the response surface method. Numerical applications show the performance of the proposed models in lifetime assessment under uncertainties; the direct method showed faster convergence than the response surface method. (C) 2010 Elsevier Ltd. All rights reserved.
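The role of a response surface in such a coupling can be sketched as follows: an inexpensive surrogate of the limit-state function stands in for the expensive mechanical (here, BEM crack-growth) solver, and the probability of failure is then estimated by Monte Carlo sampling. Everything in this sketch (the limit state, the random variables, the names) is a hypothetical illustration, not the paper's model.

```python
import random

# Hypothetical limit-state surrogate g(x, y): failure when g < 0.
# In the indirect coupling, such a cheap response surface replaces
# repeated calls to the expensive crack-growth solver.
def g(x, y):
    return 3.0 - x - 0.5 * y

def prob_failure(n=100_000, seed=1):
    """Crude Monte Carlo estimate of P(g < 0) for independent
    standard normal random variables x and y."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n)
                if g(rng.gauss(0, 1), rng.gauss(0, 1)) < 0)
    return fails / n
```

For this linear g with standard normal inputs, the exact failure probability is Phi(-3/sqrt(1.25)), roughly 0.004, so the estimate can be checked analytically; the direct coupling would instead evaluate the mechanical solver itself inside the reliability iteration.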
Abstract:
This work deals with the analysis of cracked structures using the BEM. Two formulations for analysing the crack growth process in quasi-brittle materials are discussed. They are based on the dual formulation of the BEM, in which two different integral equations are employed along the opposite sides of the crack surface. The first formulation uses the concept of a constant operator, in which the corrections of the nonlinear process are made only by applying appropriate tractions along the crack surfaces. The second BEM formulation for analysing crack growth problems is an implicit technique based on the use of a consistent tangent operator. This formulation is accurate, stable and always requires far fewer iterations to reach equilibrium within a given load increment than the classical approach. Comparative examples on classical crack growth problems illustrate the performance of the two formulations. (C) 2009 Elsevier Ltd. All rights reserved.