916 results for Long memory stochastic process


Relevance:

30.00%

Publisher:

Abstract:

Mathematical models, as instruments for understanding the workings of nature, are a traditional tool of physics, but they also play an ever increasing role in biology - in the description of fundamental processes as well as that of complex systems. In this review, the authors discuss two examples of the application of group theoretical methods, which constitute the mathematical discipline for a quantitative description of the idea of symmetry, to genetics. The first one appears, in the form of a pseudo-orthogonal (Lorentz-like) symmetry, in the stochastic modelling of what may be regarded as the simplest possible example of a genetic network and, hopefully, a building block for more complicated ones: a single self-interacting or externally regulated gene with only two possible states: 'on' and 'off'. The second is the algebraic approach to the evolution of the genetic code, according to which the current code results from a dynamical symmetry breaking process, starting out from an initial state of complete symmetry and ending in the presently observed final state of low symmetry. In both cases, symmetry plays a decisive role: in the first, it is a characteristic feature of the dynamics of the gene switch and its decay to equilibrium, whereas in the second, it provides the guidelines for the evolution of the coding rules.
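
As a hedged illustration of the kind of two-state stochastic gene switch described above (a plain random telegraph process, not the authors' pseudo-orthogonal symmetry analysis), the sketch below simulates the switching dynamics with the Gillespie algorithm and compares the fraction of time spent 'on' with the equilibrium of the master equation; the rates k_on and k_off are arbitrary placeholders.

```python
# Minimal sketch (not from the review): a two-state stochastic gene switch
# ("on"/"off" random telegraph process) simulated with the Gillespie algorithm.
# The switching rates are illustrative values, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
k_on, k_off = 0.4, 1.0   # switching rates off->on and on->off (arbitrary units)
T_end = 1_000.0

t, state = 0.0, 0        # start in the "off" state
time_on = 0.0
while t < T_end:
    rate = k_on if state == 0 else k_off
    dt = rng.exponential(1.0 / rate)      # waiting time to the next switch
    if state == 1:
        time_on += min(dt, T_end - t)     # accumulate time spent "on"
    t += dt
    state = 1 - state                     # flip the switch

print("simulated P(on):", time_on / T_end)
print("analytic  P(on):", k_on / (k_on + k_off))   # equilibrium of the master equation
```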

Relevance:

30.00%

Publisher:

Abstract:

In this report, we describe a rapid and reliable process to bond channels fabricated in glass substrates. Glass channels were fabricated by photolithography and wet chemical etching. The resulting channels were bonded against another glass plate containing a 50-µm thick PDMS layer. This same PDMS layer was also used to provide the electrical insulation of planar electrodes to carry out capacitively coupled contactless conductivity detection. The analytical performance of the proposed device was shown by using both LIF and capacitively coupled contactless conductivity detection systems. Efficiency around 47 000 plates/m was achieved with good chip-to-chip repeatability and satisfactory long-term stability of EOF. The RSD for the EOF measured in three different devices was ca. 7%. For a chip-to-chip comparison, the RSD values for migration time, electrophoretic current and peak area were below 10%. With the proposed approach, a single chip can be fabricated in less than 30 min including patterning, etching and sealing steps. This fabrication process is faster and easier than the thermal bonding process. Besides, the proposed method does not require high temperatures and provides excellent day-to-day and device-to-device repeatability.

Relevance:

30.00%

Publisher:

Abstract:

This paper is concerned with the cost efficiency in achieving the Swedish national air quality objectives under uncertainty. To realize an ecologically sustainable society, the parliament has approved a set of interim and long-term pollution reduction targets. However, there are considerable quantification uncertainties on the effectiveness of the proposed pollution reduction measures. In this paper, we develop a multivariate stochastic control framework to deal with the cost efficiency problem with multiple pollutants. Based on the cost and technological data collected by several national authorities, we explore the implications of alternative probabilistic constraints. It is found that a composite probabilistic constraint induces considerably lower abatement cost than separable probabilistic restrictions. The trend is reinforced by the presence of positive correlations between reductions in the multiple pollutants.
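
To make the chance-constraint idea concrete, here is a minimal sketch (with invented costs and effectiveness data, not the Swedish cost and technological data used in the paper) of an abatement cost minimization in which a single probabilistic constraint on uncertain pollution reductions is replaced by its deterministic equivalent under a normality assumption.

```python
# Illustrative sketch only: min c'x  s.t.  P(a'x >= target) >= p, with a ~ N(mu, Sigma).
# Under normality the constraint has the deterministic equivalent
#   mu'x - z_p * sqrt(x' Sigma x) >= target,  z_p = Phi^{-1}(p).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

c = np.array([3.0, 5.0, 8.0])          # unit abatement costs of three measures
mu = np.array([1.0, 1.5, 2.5])         # expected emission reduction per unit of measure
Sigma = np.diag([0.04, 0.09, 0.25])    # effectiveness uncertainty (variances)
target, p = 10.0, 0.95
z = norm.ppf(p)

def cost(x):
    return c @ x

def chance_margin(x):                  # >= 0 when the chance constraint holds
    return mu @ x - z * np.sqrt(x @ Sigma @ x) - target

res = minimize(cost, x0=np.full(3, 5.0), method="SLSQP",
               bounds=[(0, None)] * 3,
               constraints=[{"type": "ineq", "fun": chance_margin}])
print("abatement levels:", res.x.round(2), " cost:", round(res.fun, 2))
```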

Relevance:

30.00%

Publisher:

Abstract:

The value of a comparative study of the two conflicts stems from a remarkable similarity in the structural organization of political violence by its most influential practitioners: the IRA and Hamas. At the core, I have merely tried my best to approach a beguiling question in a fresh, dynamic way. The stultifying discourse of conflict that serves as lingua franca for the Israeli‐Palestinian issue has largely reduced strategic debate to how best the conflict can be managed – not ended. Prime Minister Benjamin Netanyahu’s focus on “economic peace” and unwillingness to commit to a two‐state solution – the consensus that has governed peacemaking for decades – belies such thinking. The Clinton Administration’s cadre of Mideast negotiators operated amidst the most rapid institutionalization of Palestinian democracy in history ‐ yet remained obsessed with Israeli‐Arab “confidence‐building” measures, doing little to legitimize the gains of Oslo. So long as Palestinians continue to view the creation of Israel as “al‐Nakba” – the catastrophe – whilst successive Israeli governments refuse to grant their aspirations any legitimacy, there can be no progress. Peace requires empathy, a substantial compromise in the context of internecine conflict. The “long war” both conflicts have become mandates an equally expansive, broad‐based and labor‐intensive approach – a demanding process that can only be called The Long Game.

Relevance:

30.00%

Publisher:

Abstract:

The regime of environmental flows (EF) must be included as a term of environmental demand in the management of water resources. Even though there are numerous methods for the computation of EF, the criteria applied at different steps in the calculation process are quite subjective, whereas the results are fixed values that must be met by water planners. This study presents a user-friendly tool for the assessment of the probability of compliance of a certain EF scenario with the natural regime in a semiarid area in southern Spain. 250 replications of a 25-yr period of different hydrological variables (rainfall, minimum and maximum flows, ...) were obtained at the study site from the combination of the Monte Carlo technique and local hydrological relationships. Several assumptions are made, such as the year-to-year independence of annual rainfall and the treatment of the variability of the meteorological agents, mainly precipitation, as the main source of uncertainty. Inputs to the tool are easily selected from a first menu and comprise measured rainfall data, EF values and the hydrological relationships for at least a 20-yr period. The outputs are the probabilities of compliance of the different components of the EF for the study period. From this, local optimization can be applied to establish EF components with a certain level of compliance in the study period. Different options for graphic output and analysis of results are included in terms of graphs and tables in several formats. This methodology turned out to be a useful tool for the implementation of an uncertainty analysis within the scope of environmental flows in water management and allowed the simulation of the impacts of several water resource development scenarios at the study site.
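
A minimal sketch of the Monte Carlo idea behind such a tool is shown below; the lognormal flow model and the EF threshold are invented, not the local hydrological relationships of the study. Many multi-year replications are generated and the fraction of simulated years in which the EF requirement is met is reported.

```python
# Hedged sketch (synthetic numbers): estimate the probability that a minimum
# environmental-flow (EF) requirement is met, using Monte Carlo replications.
import numpy as np

rng = np.random.default_rng(42)
n_reps, n_years = 250, 25          # as in the study: 250 replications of a 25-yr period
ef_min = 0.8                        # hypothetical minimum EF component (m3/s)

# Hypothetical hydrological relationship: annual minimum flow drawn from a
# lognormal whose parameters would, in practice, be fitted to >= 20 years of data.
min_flows = rng.lognormal(mean=0.0, sigma=0.5, size=(n_reps, n_years))

# A replication "complies" in a given year if the simulated minimum flow
# exceeds the EF requirement; report the mean annual compliance probability.
compliance = (min_flows >= ef_min).mean()
print(f"estimated probability of compliance: {compliance:.2f}")
```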

Relevance:

30.00%

Publisher:

Abstract:

Model Predictive Control (MPC) is a control method that solves in real time an optimal control problem over a finite horizon. The finiteness of the horizon is both the reason for MPC's success and its main limitation. In operational water resources management, MPC has in fact been successfully employed for controlling systems with a relatively short memory, such as canals, where the horizon length is not an issue. For reservoirs, which generally have a longer memory, MPC applications are presently limited to short-term management only. Short-term reservoir management can be used effectively to deal with fast processes, such as floods, but it is not capable of looking sufficiently far ahead to handle long-term issues, such as droughts. To overcome this limitation, we propose an Infinite Horizon MPC (IH-MPC) solution that is particularly suitable for reservoir management. We propose to structure the input signal by means of orthogonal basis functions, thereby reducing the optimization argument to a finite number of variables and making the control problem solvable in a reasonable time. We applied this solution to the management of the Manantali Reservoir. Manantali is a yearly reservoir located in Mali, on the Senegal river, affecting the water systems of Mali, Senegal, and Mauritania. The long-term horizon offered by IH-MPC is necessary to deal with the strongly seasonal climate of the region.
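
The following sketch illustrates the basis-function idea behind IH-MPC under invented reservoir dynamics and an invented objective (it is not the Manantali model): a year-long release trajectory is expressed as a combination of a few Fourier-type basis functions, so the optimizer searches over a handful of coefficients instead of 365 daily decisions.

```python
# Minimal sketch of structuring the input signal with orthogonal basis functions.
import numpy as np
from scipy.optimize import minimize

T = 365                                   # long horizon (days)
t = np.arange(T)
# Orthogonal (Fourier-type) basis: constant term plus low-frequency harmonics.
basis = np.column_stack(
    [np.ones(T)] +
    [f(2 * np.pi * k * t / T) for k in (1, 2, 3) for f in (np.sin, np.cos)]
)

inflow = 50 + 40 * np.sin(2 * np.pi * (t - 120) / T)   # hypothetical seasonal inflow (m3/s)
s0, s_max, demand = 5e3, 1e4, 45.0

def objective(theta):
    release = np.clip(basis @ theta, 0.0, None)        # releases from the coefficients
    storage = s0 + np.cumsum(inflow - release)         # simple mass balance
    deficit = np.clip(demand - release, 0.0, None)
    overflow = np.clip(storage - s_max, 0.0, None)
    return np.sum(deficit**2) + 1e-2 * np.sum(overflow**2)

theta0 = np.zeros(basis.shape[1])
theta0[0] = demand                                     # start from a constant release
res = minimize(objective, theta0, method="Nelder-Mead", options={"maxiter": 5000})
print("optimized basis coefficients:", res.x.round(2))
```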

Relevance:

30.00%

Publisher:

Abstract:

The presented work deals with the calibration of a 2D numerical model for the simulation of long-term bed load transport. A settled basin along an alpine stream was used as a case study. The focus is to parameterise the multi-fractional transport model used such that a dynamically balanced behaviour regarding erosion and deposition is reached. The 2D hydrodynamic model utilizes a multi-fraction, multi-layer approach to simulate morphological changes and bed load transport. The mass balancing is performed between three layers: a top mixing layer, an intermediate subsurface layer and a bottom layer. Using this approach imposes computational limitations on calibration. Due to the high computational demands, the type of calibration strategy is crucial not only for the result, but also for the time required for calibration. Brute force methods such as Monte Carlo type methods may require too large a number of model runs. All calibration strategies tested here used multiple model runs, utilising the parameterization and/or results from the previous run. One concept was to reset to the initial bed elevations after each run, allowing the resorting process to converge to stable conditions. As an alternative, or in combination, the roughness was adapted based on the resulting nodal grading curves from the previous run. Since the adaptations are a spatial process, the whole model domain is subdivided into homogeneous sections regarding hydraulics and morphological behaviour. For a faster optimization, the adaptation of the parameters is made section-wise. Additionally, a systematic variation was performed, considering results from previous runs and the interaction between sections. The approach used can be considered similar to evolutionary-type calibration approaches, but using analytical links instead of random parameter changes.
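
A conceptual sketch of such a section-wise iterative calibration loop is given below; the 2D model is replaced by a dummy response function, and the roughness update rule and all numbers are purely illustrative, not those used in the study.

```python
# Conceptual sketch of section-wise iterative calibration against a balanced
# erosion/deposition target (dummy model, illustrative parameters only).
import numpy as np

rng = np.random.default_rng(1)
n_sections = 5
roughness = np.full(n_sections, 0.030)      # initial roughness per homogeneous section

def run_model(roughness):
    """Placeholder for one long-term bed-load simulation: returns the net bed
    elevation change per section (positive = deposition, negative = erosion)."""
    return (0.035 - roughness) * 20 + rng.normal(0.0, 0.005, n_sections)

for iteration in range(10):
    bed_change = run_model(roughness)       # deviation from the balanced target of zero
    # Section-wise proportional correction; the sign and gain are tied to the
    # dummy model's response and carry no physical meaning here.
    roughness = np.clip(roughness + 0.01 * bed_change, 0.02, 0.08)
    if np.max(np.abs(bed_change)) < 0.01:
        break

print("calibrated section roughness:", roughness.round(4))
```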

Relevance:

30.00%

Publisher:

Abstract:

This paper tackles the problem of aggregate TFP measurement using stochastic frontier analysis (SFA). Data from Penn World Table 6.1 are used to estimate a world production frontier for a sample of 75 countries over a long period (1950-2000), taking advantage of the model offered by Battese and Coelli (1992). We also apply the decomposition of TFP suggested by Bauer (1990) and Kumbhakar (2000) to a smaller sample of 36 countries over the period 1970-2000 in order to evaluate the effects of changes in efficiency (technical and allocative), scale effects and technical change. This allows us to analyze the role of productivity and its components in the economic growth of developed and developing nations, in addition to the importance of factor accumulation. Although not much explored in the study of economic growth, frontier techniques seem to be of particular interest for that purpose, since the separation of efficiency effects and technical change has a direct interpretation in terms of the catch-up debate. The estimated technical efficiency scores reveal the efficiency of nations in the production of non-tradable goods, since the GDP series used is PPP-adjusted. We also provide a second set of efficiency scores, corrected in order to reveal efficiency in the production of tradable goods, and rank them. When compared to the rankings of productivity indexes offered by the non-frontier studies of Hall and Jones (1996) and Islam (1995), our ranking shows a somewhat more intuitive order of countries. Rankings of the technical change and scale effects components of TFP change are also very intuitive. We also show that productivity is responsible for virtually all the differences in performance between developed and developing countries in terms of rates of growth of income per worker. More important, we find that changes in allocative efficiency play a crucial role in explaining differences in the productivity of developed and developing nations, even larger than the one played by the technology gap.
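
For reference, a generic form of the TFP-change decomposition in the spirit of Bauer (1990) and Kumbhakar (2000) is (the notation is illustrative and may differ from the paper's):

```latex
\dot{\mathrm{TFP}}
  = \underbrace{\frac{\partial \ln f(x,t)}{\partial t}}_{\text{technical change}}
  + \underbrace{\dot{\mathrm{TE}}}_{\text{technical efficiency change}}
  + \underbrace{(\varepsilon - 1)\sum_{j}\frac{\varepsilon_j}{\varepsilon}\,\dot{x}_j}_{\text{scale effect}}
  + \underbrace{\sum_{j}\Bigl(\frac{\varepsilon_j}{\varepsilon} - s_j\Bigr)\dot{x}_j}_{\text{allocative efficiency change}}
```

where the ε_j are output elasticities, ε = Σ_j ε_j measures returns to scale, the s_j are input cost shares and dots denote growth rates.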

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates whether or not a multivariate cointegrated process with structural change can describe the Brazilian term structure of interest rate data from 1995 to 2006. In this work the break points and the number of cointegrating vectors are assumed to be known. The estimated model has four regimes. Only three of them are statistically different. The first starts at the beginning of the sample and goes until September 1997. The second goes from October 1997 until December 1998. The third starts in January 1999 and goes until the end of the sample. Monthly data are used. Models that allow for some similarities across the regimes are also estimated and tested. The models are estimated using the Generalized Reduced-Rank Regressions developed by Hansen (2003). All imposed restrictions can be tested using likelihood ratio tests with standard asymptotic chi-squared distributions. The results of the paper show evidence in favor of the long-run implications of the expectations hypothesis for Brazil.
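
Hansen's (2003) generalized reduced-rank regression with structural change is not available in common statistical libraries, so the hedged sketch below only illustrates the plain Johansen cointegration-rank test on synthetic yields (via statsmodels), without regime breaks.

```python
# Hedged sketch: Johansen trace test on synthetic term-structure data with one
# common stochastic trend (the "level") and stationary spreads across maturities.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(0)
n = 140                                     # roughly a monthly sample, 1995-2006
level = np.cumsum(rng.normal(0, 0.3, n))    # common stochastic trend
spreads = rng.normal(0, 0.1, (n, 2))        # stationary spreads
yields = np.column_stack([level, level + spreads[:, 0], level + spreads[:, 1]])

res = coint_johansen(yields, det_order=0, k_ar_diff=1)
print("trace statistics:   ", res.lr1.round(2))
print("5% critical values: ", res.cvt[:, 1].round(2))
```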

Relevance:

30.00%

Publisher:

Abstract:

When firms decide whether or not to invest in a given long-term investment project (a 5- to 10-year horizon), some methodologies that are alternatives to Discounted Cash Flow (DCF) can be useful both to confirm the viability of the business and to indicate the best moment to start the venture. Analyses that take into account the uncertainty of future cash flows and the flexibility in the project's starting date can be built with a stochastic approach, using methodologies such as the solution of the differential equations that describe Brownian motion. Under certain conditions, investment opportunities in projects can be treated as if they were real call options without an expiration date, as in the model proposed by McDonald-Siegel (1986), for decision making and for the optimal investment timing. This work analyzes the viability of investments in the telecommunications market using non-deterministic models, in which the most relevant variable is the dispersion of returns, that is, the variance represents the risk associated with a given venture.
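
A minimal illustration of the McDonald-Siegel (1986) perpetual real-option rule (with made-up parameter values, in the standard Dixit-Pindyck notation) computes the investment threshold V* above which immediate investment is optimal; naive DCF would invest as soon as the project value exceeds the cost I, and the gap V* - I is the value of waiting.

```python
# Illustrative computation of the optimal investment threshold for a perpetual
# option to invest in a project whose value follows a geometric Brownian motion.
import math

r, delta, sigma = 0.10, 0.04, 0.30     # discount rate, "dividend" yield, volatility
I = 100.0                              # sunk investment cost

a = (r - delta) / sigma**2 - 0.5
beta1 = 0.5 - (r - delta) / sigma**2 + math.sqrt(a**2 + 2 * r / sigma**2)

V_star = beta1 / (beta1 - 1) * I       # invest only when the project value reaches V*
print(f"beta1 = {beta1:.3f},  investment threshold V* = {V_star:.1f}")
```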

Relevance:

30.00%

Publisher:

Abstract:

I study the asset-pricing implications of an environment with feedback traders and rational arbitrageurs. Feedback traders are defined as possibly naive investors who buy after a rise in prices and sell after a drop in prices. I consider two types of feedback strategies: (1) short-term (SF), motivated by institutional rules such as stop-losses and margin calls, and (2) long-term (LF), motivated by representativeness bias from non-sophisticated investors. Their presence in the market follows a stochastic regime-switching process. A short-lived assumption for the arbitrageurs prevents the correction of the mispricing generated by feedback strategies. The model estimated using US data suggests that the regime switching is able to capture the time-varying autocorrelation of returns. The segregation of feedback types helps to identify the long-term component that otherwise would not show up due to the large movements implied by the SF type. The paper also has normative implications for practitioners, since it provides a methodology to identify mispricings driven by feedback traders.
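
A toy sketch of the regime-switching mechanism (not the paper's estimated model) is given below: returns follow an AR(1) whose coefficient switches with a two-state Markov chain, so the sample autocorrelation of returns varies with the presence of feedback traders.

```python
# Toy simulation: returns with first-order autocorrelation that switches
# across a two-state Markov regime (illustrative parameters only).
import numpy as np

rng = np.random.default_rng(0)
T = 2000
p_stay = np.array([0.98, 0.95])     # persistence of regime 0 (no feedback) and regime 1
phi = np.array([0.0, 0.35])         # AR(1) coefficient in each regime
sigma = 0.01

regime = np.zeros(T, dtype=int)
r = np.zeros(T)
for t in range(1, T):
    stay = rng.random() < p_stay[regime[t - 1]]
    regime[t] = regime[t - 1] if stay else 1 - regime[t - 1]
    # Positive-feedback regime: today's return partly chases yesterday's.
    r[t] = phi[regime[t]] * r[t - 1] + sigma * rng.normal()

for s in (0, 1):
    mask = regime[1:] == s
    ac = np.corrcoef(r[1:][mask], r[:-1][mask])[0, 1]
    print(f"regime {s}: sample autocorrelation of returns = {ac:.2f}")
```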

Relevance:

30.00%

Publisher:

Abstract:

In this paper I claim that, in a long-run perspective, measurements of income inequality, under any of the usual inequality measures used in the literature, are upward biased. The reason is that such measurements are cross-sectional by nature and, therefore, do not take into consideration the turnover in the job market which, in the long run, equalizes within-group (e.g., same-education groups) inequalities. Using a job-search model, I show how to derive the within-group invariant-distribution Gini coefficient of income inequality, how to calculate the size of the bias and how to organize the data in order to solve the problem. Two examples are provided to illustrate the argument.
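
As a small, generic illustration (not derived from the paper's job-search model), the following sketch computes a Gini coefficient directly from a discretized invariant income distribution rather than from a single cross-section; the income states and probabilities are invented.

```python
# Gini coefficient of a discrete long-run (invariant) income distribution.
import numpy as np

incomes = np.array([800.0, 1200.0, 2000.0, 3500.0, 6000.0])   # income states
probs = np.array([0.25, 0.30, 0.25, 0.15, 0.05])               # invariant distribution

def gini(values, weights):
    """Gini coefficient of a discrete distribution with probability weights."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum_inc = np.concatenate([[0.0], np.cumsum(v * w) / np.sum(v * w)])  # Lorenz ordinates
    # Trapezoidal area under the Lorenz curve over the population shares w.
    area = np.sum(0.5 * (cum_inc[1:] + cum_inc[:-1]) * w)
    return 1.0 - 2.0 * area

print(f"invariant-distribution Gini: {gini(incomes, probs):.3f}")
```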

Relevance:

30.00%

Publisher:

Abstract:

This paper applies to the analysis of the interstate income distribution in Brazil a set of techniques that have been widely used in the current empirical literature on growth and convergence. Usual measures of dispersion in the interstate income distribution (the coefficient of variation and Theil's index) suggest that σ-convergence was an unequivocal feature of the regional growth experience in Brazil between 1970 and 1986. After 1986, the process of convergence seems, however, to have slowed down almost to a halt. A standard growth model is shown to fit the regional data well and to explain a substantial amount of the variation in growth rates, providing estimates of the speed of (conditional) β-convergence of approximately 3% p.a. Different estimates of the long run distribution implied by the recent growth trends point towards further reductions in the interstate income inequality, but also suggest that the relative per capita incomes of a significant number of states and the number of "very poor" and "poor" states were, in 1995, already quite close to their steady-state values.
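
A hedged sketch of the β-convergence calculation on synthetic state-level data (not the Brazilian series used in the paper): the cross-sectional slope b of average growth on initial log income implies a convergence speed λ = −ln(1 + bT)/T.

```python
# Cross-sectional beta-convergence regression on synthetic data:
#   g_i = a + b * log(y0_i) + e_i,   lambda = -ln(1 + b*T) / T.
import numpy as np

rng = np.random.default_rng(0)
T = 16                                       # e.g. a 1970-1986 window
lam_true = 0.03                              # 3% p.a. convergence, as in the paper
b_true = -(1 - np.exp(-lam_true * T)) / T    # implied regression slope

log_y0 = rng.uniform(7.0, 9.5, 25)                            # initial log income, 25 "states"
growth = 0.25 + b_true * log_y0 + rng.normal(0, 0.01, 25)     # average annual growth

b_hat = np.polyfit(log_y0, growth, 1)[0]                      # OLS slope
lam_hat = -np.log(1 + b_hat * T) / T
print(f"estimated slope b = {b_hat:.4f}, implied convergence speed = {lam_hat:.3f} per year")
```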

Relevance:

30.00%

Publisher:

Abstract:

An interesting alternative for a company that intends to take a long position in its own shares, or to launch a share buyback program in the future, without having to use cash or take out a loan, while also protecting itself against an eventual rise in the share price, is to enter into an equity swap. In this swap, the company receives the variation of its own stock while paying a fixed or floating interest rate. However, this type of swap exhibits wrong-way risk, that is, there is a positive dependence between the swap's underlying stock and the company's default probability, which a bank needs to take into account when pricing this type of swap. In this work we propose a model to incorporate the dependence between default probabilities and counterparty exposure in the CVA calculation for this type of swap. We use a Cox process to model the time of default, with the stochastic default intensity following a CIR-type model, and assume that the random factor driving the underlying stock and the random factor driving the default intensity are jointly given by a bivariate standard normal distribution. We analyze the impact on the CVA of incorporating wrong-way risk for this type of swap with different counterparties, and for different maturities and correlation levels.
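
The sketch below is a heavily simplified Monte Carlo illustration of CVA with wrong-way risk, not the exact model of this work: the equity shock and the CIR intensity shock are drawn from a bivariate standard normal with correlation rho, so the bank's exposure on the equity leg tends to be large exactly when default is more likely; all dynamics, parameters and the payoff are illustrative.

```python
# Simplified CVA Monte Carlo with wrong-way risk (illustrative parameters only).
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, T = 20_000, 50, 2.0
dt = T / n_steps

S0, sigma_S, r = 100.0, 0.35, 0.05                     # equity under risk-neutral GBM
lam0, kappa, theta, sigma_l = 0.02, 0.5, 0.03, 0.10    # CIR default intensity
rho, LGD, notional = 0.6, 0.6, 1.0

S = np.full(n_paths, S0)
lam = np.full(n_paths, lam0)
surv = np.ones(n_paths)                                # survival probability per path
cva = 0.0

for k in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
    S *= np.exp((r - 0.5 * sigma_S**2) * dt + sigma_S * np.sqrt(dt) * z1)
    # CIR step (full-truncation Euler keeps the intensity non-negative).
    lam = np.maximum(
        lam + kappa * (theta - np.maximum(lam, 0.0)) * dt
        + sigma_l * np.sqrt(np.maximum(lam, 0.0) * dt) * z2, 0.0)
    default_prob = surv * (1.0 - np.exp(-lam * dt))    # default in (t, t+dt]
    surv -= default_prob
    # Bank's exposure on the equity leg: positive when the stock has risen.
    exposure = np.maximum(notional * (S / S0 - 1.0), 0.0)
    cva += LGD * np.mean(np.exp(-r * (k + 1) * dt) * default_prob * exposure)

print(f"CVA estimate (rho = {rho}): {cva:.4f}")
```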

Relevance:

30.00%

Publisher:

Abstract:

This paper deals with the fundamental elements of a curricular proposal aimed at the education of literacy teachers of young people and adults, insofar as it concerns the relation between academic knowledge and the knowledge arising from classroom experience. The empirical field of the research comes from the work of the teachers responsible for the training of the literacy teachers of the GerAção Cidadã Program (2004-2005), linked, as an extension program, to the Federal University of Rio Grande do Norte. Indeed, it tries to understand how these teacher educators conceive the link between the experiential knowledge to which their lives bear witness and the scientific knowledge they are entitled to mediate in class. This work is founded on the principles of collaborative research, a kind of qualitative research. It makes use of procedures supported by qualitative research, especially those related to reflexive sessions, as well as documental research and semi-structured interviews. These spaces have afforded the group of literacy teachers the opportunity to reflect on their practice, both individually and collectively, in order to work out contributory proposals with a view to changes in educative actions. As elaborated contributions, we present a discussion of the specificities of the education of teachers for EJA (youth and adult education) in their differentiated social roles. Reflecting on the experiences of the teacher educators, we highlight the elements we regard as essential to the constitution of a formation proposal, such as formative times and spaces, dialogue and social memory. The curricular organization is understood as part of an enlarged dimension that is not restricted to the school; rather, it is visualized as a structuring instance that connects different kinds of knowledge, spanning community and university. Under this optics, we conclude that the connection the scientific knowledge establishes with the experiential knowledge, immersed in the cultural practices of those involved in the formative process, is the foundation of a curricular proposal for a course for educators committed to the need of changing society.